1 Description

This report is divided into the following parts:

1.1 Clinical Context

Computed tomography (CT) is a widely used imaging exam in oncology: it reflects the density of the tissues of the human body. It is therefore well suited to the study of lung cancer, because lungs are mostly filled with air (low density) while tumors are made of dense tissue.

1.2 Pathology

Non-Small Cell Lung Cancer (NSCLC) can itself be split into four major subtypes based on histological observations: squamous cell carcinoma, large cell carcinoma, adenocarcinoma, and mixed subtypes.

1.3 Goal

Predict the survival time of a patient (remaining days to live) from one three-dimensional CT scan (grayscale image) and a set of pre-extracted quantitative imaging features, as well as clinical data.

1.4 Dataset

Each patient has one CT scan and one binary segmentation mask. The segmentation mask is a binary volume of the same size as the CT scan, containing ones wherever there is tumour and zeroes everywhere else. The CT scans and the associated segmentation masks are subsets of two public datasets:

  • NSCLC Radiomics (subset of 285 patients)
  • NSCLC RadioGenomics (subset of 141 patients)
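The mask representation described above can be checked with a small NumPy sketch on a synthetic volume (the real files are .npz archives with 'scan' and 'mask' entries of shape (92, 92, 92); loading them is shown below):

```python
import numpy as np

# Synthetic stand-in for one patient's mask volume.
mask = np.zeros((92, 92, 92), dtype=bool)
mask[40:55, 40:55, 30:60] = True  # a toy "tumor" block

# The mask is binary: 1 (True) inside the tumour, 0 elsewhere,
# so the tumor volume in voxels is simply the sum of the mask.
tumor_voxels = int(mask.sum())
print(tumor_voxels)  # 15 * 15 * 30 = 6750
```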

For each patient, both the training and validation sets contain the time to event (in days) as well as a censorship indicator. Censorship indicates whether the event (death) was observed or whether the patient left the study: this can happen when the patient was lost to follow-up, or when the patient died of causes unrelated to the disease.
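A minimal sketch of what the target table looks like (the column names and values here are made up for illustration, not the challenge's exact schema): one row per patient, with the time to event in days and a censorship flag.

```python
import pandas as pd

# Toy survival targets: event = 1 means death was observed,
# event = 0 means the patient was censored.
targets = pd.DataFrame({
    "patient_id": ["patient_001", "patient_002", "patient_003"],
    "time_days":  [420, 1033, 250],
    "event":      [1, 0, 1],
})

# For censored patients the true survival time is only known to exceed
# time_days; a model must treat it as a lower bound, not an exact value.
observed_rate = targets["event"].mean()
print(observed_rate)
```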

2 Tumor Array Slices Exploration

2.1 Setting the Python version and Anaconda environment for R :-)

reticulate::use_python("/Library/Frameworks/Python.framework/Versions/3.7/bin/python3", required = TRUE)
#reticulate::use_python("/Users/Mezhoud/anaconda3/bin/python3", required = TRUE)
reticulate::py_config()
## python:         /Library/Frameworks/Python.framework/Versions/3.7/bin/python3
## libpython:      /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/config-3.7m-darwin/libpython3.7.dylib
## pythonhome:     /Library/Frameworks/Python.framework/Versions/3.7:/Library/Frameworks/Python.framework/Versions/3.7
## version:        3.7.3 (v3.7.3:ef4ec6ed12, Mar 25 2019, 16:52:21)  [Clang 6.0 (clang-600.0.57)]
## numpy:          /Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/site-packages/numpy
## numpy_version:  1.18.1
## 
## NOTE: Python version was forced by use_python function
#pip3 install --user sklearn
knitr::opts_chunk$set(engine.path = list(
  python = '/Library/Frameworks/Python.framework/Versions/3.7/bin/python3'
))

# check seaborn module 
reticulate::py_module_available("seaborn")
## [1] TRUE
#reticulate::install_miniconda("xgboost")

2.2 Load scans and masks of lung tumors

import numpy as np
from matplotlib import pyplot as plt
#from matplotlib import pyplot
from PIL import Image

img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']

print("the dimension of scan array is: ", str(scan.shape))
## the dimension of scan array is:  (92, 92, 92)
print("the dimension of mask array is: ", str(mask.shape))
## the dimension of mask array is:  (92, 92, 92)
print("plot some images from patient 002: ")
#plt.imshow(scan[:, :, 3])
## plot some images from patient 002:
f, axarr = plt.subplots(2,3)
axarr[0,0].imshow(scan[1:92, 1:92, 0])
axarr[1,0].imshow(mask[1:92, 1:92, 0])
axarr[0,1].imshow(scan[:, :, 3])
axarr[1,1].imshow(mask[:, :, 3])
axarr[0,2].imshow(scan[:, :, 80])
axarr[1,2].imshow(mask[:, :, 80])

2.2.1 Function to plot multiple images from an array


def plot_figures(figures, nrows = 1, ncols=1):
  """Plot a dictionary of figures.

  Parameters
  ----------
  figures : <title, figure> dictionary
  ncols : number of columns of subplots wanted in the display
  nrows : number of rows of subplots wanted in the figure
  """
  fig, axeslist = plt.subplots(ncols=ncols, nrows=nrows)
  for ind,title in zip(range(len(figures)), figures):
      axeslist.ravel()[ind].imshow(figures[title], cmap='jet')
      axeslist.ravel()[ind].set_title(title)
      axeslist.ravel()[ind].set_axis_off()
  plt.tight_layout() 


img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']


# generation of a dictionary of (title, images)
number_of_im = 6
scan = {'scan'+str(i): scan[1:92, 1:92, i] for i in range(number_of_im)}

# plot of the images in a figure, with 2 rows and 3 columns
plot_figures(scan, 2, 3)
plt.show()

The plot shows colored scan images of 6 slices. At this stage it is not easy to distinguish the tumor.

The dataset also provides a mask for each scan slice, which locates the position of the tumor in the scan.

mask = {'mask'+str(i): mask[1:92, 1:92, i] for i in range(number_of_im)}
# plot of the images in a figure, with 2 rows and 3 columns
plot_figures(mask, 2, 3)
plt.show()

  • The first 3 slices do not show any tumor streak; however, the next 3 indicate the position of the tumor in red.

  • If we plot more slices, we can observe the tumor growing in size from slice to slice.

  • Towards the end, the size of the tumor decreases.

  • We can note that the crop is adjusted to the size of the tumor.


img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']

mask = {'mask'+str(i): mask[1:92, 1:92, i] for i in range(90)}
# plot of the images in a figure, with 9 rows and 10 columns
plot_figures(mask, 9, 10)
plt.show()

  • We can select a representative mask per patient using the sum of its (0,1) pixels: the slice whose mask has the highest sum can be taken to represent that patient.
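This selection rule can be sketched directly in NumPy (a synthetic mask is used here so the example is self-contained): score each of the 92 axial slices by its number of tumor voxels and keep the argmax.

```python
import numpy as np

# Toy mask whose largest tumor cross-section is at slice index 45.
mask = np.zeros((92, 92, 92), dtype=bool)
mask[30:60, 30:60, 45] = True      # 900 tumor pixels
mask[35:55, 35:55, 44] = True      # 400 tumor pixels
mask[35:55, 35:55, 46] = True      # 400 tumor pixels

slice_scores = mask.sum(axis=(0, 1))   # one tumor-pixel count per slice
best_slice = int(np.argmax(slice_scores))
print(best_slice)  # 45
```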

If we compare with the scan slides, we obtain:

scan = {'scan'+str(i): scan[1:92, 1:92, i] for i in range(90)}
# plot of the images in a figure, with 9 rows and 10 columns
plot_figures(scan, 9, 10)
plt.show()

  • It is still not easy to delimit the tumor in the scan images.

  • Compared to the masks, we can note that, between scan 34 and scan 65, the slices show more yellow stain and less blue.

2.2.2 Superimposing Scan and Mask images

import numpy as np
from matplotlib import pyplot as plt
from PIL import Image

img_array = np.load('train/images/patient_002.npz')
scan = img_array['scan']
mask = img_array['mask']


background = mask[1:92, 1:92, 56]
overlay = scan[1:92, 1:92, 56]

plt.title("Scan/Mask: 56")
plt.imshow(background, cmap='gray')
plt.imshow(overlay, cmap='jet', alpha=0.9)

  • It is now clear that the masks seem more useful than the scans, because the tumor is not clearly visible in the scan slices.

2.3 Load images from test dataset

img_array = np.load('test/images/patient_001.npz')
scan = img_array['scan']
mask = img_array['mask']


# generation of a dictionary of (title, images)
number_of_im = 90
scan = {'scan'+str(i): scan[1:92, 1:92, i] for i in range(number_of_im)}

# plot of the images in a figure, with 9 rows and 10 columns
plot_figures(scan, 9, 10)
plt.show()

  • Plot mask slices from the test dataset
mask = {'mask'+str(i): mask[1:92, 1:92, i] for i in range(number_of_im)}
# plot of the images in a figure, with 9 rows and 10 columns
plot_figures(mask, 9, 10)
plt.show()

  • Superimpose a mask and scan slice from the test dataset
img_array = np.load('test/images/patient_001.npz')
scan = img_array['scan']
mask = img_array['mask']

background = mask[1:92, 1:92, 35]
overlay = scan[1:92, 1:92, 35]

plt.title("Scan/Mask: 35")
plt.imshow(background, cmap='gray')
plt.imshow(overlay, cmap='jet', alpha=0.9)

  • In the test images, we can also observe tumor slices, as in the train dataset.

  • For the training step, it may be better to use mask slices than scans. But we need to explore the variables in the clinical and radiomics data and think about how to associate images with numeric variables.

  • One thing we can do is convert the slices to a dataframe (each slice in one row), so that we obtain one matrix for each patient's tumor.

  • At this step I will switch from Python to R :-)

2.4 Import image from python environment to R

The goal of this step is to convert each image matrix into a vector, so that each image can be stored in one row. Finally, we obtain one dataframe with 92 rows (images) for each sample (patient).
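The same flattening can be sketched directly in NumPy before switching to R: moving the slice axis first and reshaping yields one row per slice and 92*92 = 8464 columns, matching the (92, 8464) dataframe built below. (Pixel order differs, since R's as.vector() is column-major while NumPy's reshape is row-major, but the shape is the same.)

```python
import numpy as np

# A dummy volume standing in for one patient's (92, 92, 92) scan.
volume = np.arange(92 * 92 * 92).reshape(92, 92, 92)

# Move the slice axis (last) to the front, then flatten each slice
# into one row of 8464 pixels.
rows = np.moveaxis(volume, 2, 0).reshape(92, -1)
print(rows.shape)  # (92, 8464)
```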

2.4.1 Import useful R packages

2.4.1.1 Useful python function


import numpy as np

def load_img_array(file):
  im_array = np.load(file)
  scan = im_array['scan']
  mask = im_array['mask']
  return scan,mask

2.4.1.2 Understanding the structure of the array of images

patient_002 <- reticulate::py$load_img_array('train/images/patient_002.npz')

paste0("One image is a: ", class(patient_002[[1]][,,1]))
## [1] "One image is a: matrix"
paste0("Two images are an: ", class(patient_002[[1]][,,1:2]))
## [1] "Two images are an: array"
paste0("Print the first 10x10 pixels of Scan N°1: "); patient_002[[1]][,,1][1:10, 1:10]
## [1] "Print the first 10x10 pixels of Scan N°1: "
##       [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10]
##  [1,] -777 -759 -707 -697 -749 -796 -826 -837 -858  -860
##  [2,] -783 -791 -774 -768 -787 -808 -827 -826 -829  -829
##  [3,] -804 -841 -827 -812 -820 -840 -831 -801 -792  -794
##  [4,] -830 -857 -839 -816 -805 -818 -801 -764 -734  -722
##  [5,] -844 -854 -843 -823 -799 -787 -771 -743 -704  -670
##  [6,] -844 -849 -844 -831 -821 -810 -809 -791 -760  -719
##  [7,] -848 -847 -848 -844 -847 -841 -849 -841 -829  -806
##  [8,] -847 -856 -854 -846 -840 -813 -826 -849 -848  -836
##  [9,] -840 -851 -836 -823 -835 -796 -803 -853 -857  -833
## [10,] -847 -841 -829 -817 -860 -832 -799 -842 -865  -838
paste0("Print the first 10x10 pixels of Mask N°3: "); patient_002[[2]][,,3][1:10, 1:10]
## [1] "Print the first 10x10 pixels of Mask N°3: "
##        [,1]  [,2]  [,3]  [,4]  [,5]  [,6]  [,7]  [,8]  [,9] [,10]
##  [1,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [2,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [3,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [4,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [5,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [6,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [7,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [8,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##  [9,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
## [10,] FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE

2.4.1.3 Convert the array of matrices to a list of matrices

ls_scan_patient_002 <- lapply(seq(dim(patient_002[[1]])[3]), function(x) patient_002[[1]][ , , x])
ls_mask_patient_002 <- lapply(seq(dim(patient_002[[2]])[3]), function(x) patient_002[[2]][ , , x])

paste0("The number of scan images is: ", length(ls_scan_patient_002))
## [1] "The number of scan images is: 92"
paste0("The number of mask images is: ", length(ls_mask_patient_002))
## [1] "The number of mask images is: 92"

2.4.1.4 Convert image matrix to vector

mat2vec <- function(path, target = 'mask'){

  # Load patient CT scan and mask
  patient <- py$load_img_array(path)

  #### For scans
  if('scan' %in% target){
    # list scans
    scan <- lapply(seq(dim(patient[[1]])[3]), function(x) patient[[1]][ , , x])
    # vectorize each matrix (image) into a vector
    vec_scan <- lapply(scan, function(x) as.vector(x))
    # bind vectors into a dataframe by row
    df_scan <- as.data.frame(do.call(rbind, vec_scan))
    # extract patient_id from path
    scan_id <- paste0(tools::file_path_sans_ext(basename(path)), "_scan")
  }

  #### For masks
  if('mask' %in% target){
    # list masks
    mask <- lapply(seq(dim(patient[[2]])[3]), function(x) patient[[2]][ , , x])
    # vectorize each mask (image) into a vector
    vec_mask <- lapply(mask, function(x) as.vector(x))
    # bind vectors into a dataframe by row
    df_mask <- as.data.frame(do.call(rbind, vec_mask))
    # extract patient_id from path
    mask_id <- paste0(tools::file_path_sans_ext(basename(path)), "_mask")
  }

  if('scan' %in% target && 'mask' %in% target){
    # group the scan and mask dataframes in a list named by patient_id
    ls <- list(df_scan, df_mask)
    names(ls) <- c(scan_id, mask_id)
    return(ls)
  } else if(identical(target, 'scan')){
    return(df_scan)
  } else {
    return(df_mask)
  }
}

patient2 <- mat2vec('train/images/patient_002.npz', target = c("mask", "scan"))

paste0("The output is a: ", class(patient2))
## [1] "The output is a: list"
paste0("With length of: ", length(patient2))
## [1] "With length of: 2"
paste0("The names of two elements are: ") ; names(patient2)
## [1] "The names of two elements are: "
## [1] "patient_002_scan" "patient_002_mask"
paste0("which are: ", class(patient2$patient_002_scan))
## [1] "which are: data.frame"
paste0("The dimension of each dataframe is: ") ; dim(patient2$patient_002_scan)
## [1] "The dimension of each dataframe is: "
## [1]   92 8464
  • We plan to use only the masks for modeling

  • Potential methods: keras, mxnet

  • The training dataset must contain 92 rows corresponding to the 92 masks of each patient. The first column is the event (target) for the patient. The remaining columns are the pixel intensities, or in our case the (0,1) tags, which delimit the tumor if it exists.
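The layout described above can be sketched in Python (the time_days column name and the patient values are made up for illustration):

```python
import numpy as np
import pandas as pd

def patient_rows(mask, time_days):
    """Build the 92-row block for one patient: target column first,
    then the 8464 flattened (0,1) mask pixels of each slice."""
    flat = np.moveaxis(mask, 2, 0).reshape(92, -1).astype(int)
    df = pd.DataFrame(flat)
    df.insert(0, "time_days", time_days)
    return df

# Toy mask for one patient.
mask = np.zeros((92, 92, 92), dtype=bool)
mask[40:50, 40:50, 40:50] = True

df = patient_rows(mask, time_days=420)
print(df.shape)  # (92, 8465): 1 target column + 8464 pixel columns
```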

2.4.1.5 Select only the best mask for each Patient

We consider that the mask with the most TRUE values is the most representative slice for a patient. The idea is to convert all matrices to vectors and count the TRUE values in each row.

masks2 <- mat2vec('train/images/patient_002.npz', target = c("mask"))

## display the sum of TRUE in each row for Patient002
paste("Number of TRUE in each slide for Patient_002:") ; rowSums(masks2)
## [1] "Number of TRUE in each slide for Patient_002:"
##  [1]    0    0    0   55   60   60  195  209  209  504  548  548 1158 1256 1276
## [16] 1737 1809 1835 2353 2429 2474 2870 2920 2960 3225 3251 3280 3537 3567 3546
## [31] 3619 3619 3608 3558 3557 3557 3537 3534 3534 3664 3668 3671 3872 3911 3928
## [46] 3994 4025 4026 3818 3822 3806 3553 3492 3495 3552 3520 3503 2973 2913 2887
## [61] 2080 2024 1988 1479 1479 1460 1257 1293 1289 1175 1192 1199  663  679  674
## [76]  901  922  924  771  741  755  745  671  654  622  255  237  206    0    0
## [91]    0    0
get_best_mask <- function(masks){
  # Return the row (slice) with the largest number of TRUE values
  Case <- masks[sort(rowSums(masks), index.return = TRUE, decreasing = TRUE)$ix, ][1, ]
  return(Case)
}

masks2[sort(rowSums(masks2), index.return = TRUE, decreasing = TRUE)$ix, ][1, ]
##       V1    V2    V3    V4    V5    V6    V7    V8    V9   V10   V11   V12
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
## (output truncated: row 48 — the slice with the most TRUE values — has
## 8464 logical columns, mostly FALSE, with short runs of TRUE wherever the
## flattened tumor mask crosses an image row)
## 48 FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1517 V1518 V1519 V1520 V1521 V1522 V1523 V1524 V1525 V1526 V1527 V1528
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1529 V1530 V1531 V1532 V1533 V1534 V1535 V1536 V1537 V1538 V1539 V1540
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1541 V1542 V1543 V1544 V1545 V1546 V1547 V1548 V1549 V1550 V1551 V1552
## 48  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1553 V1554 V1555 V1556 V1557 V1558 V1559 V1560 V1561 V1562 V1563 V1564
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1565 V1566 V1567 V1568 V1569 V1570 V1571 V1572 V1573 V1574 V1575 V1576
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1577 V1578 V1579 V1580 V1581 V1582 V1583 V1584 V1585 V1586 V1587 V1588
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1589 V1590 V1591 V1592 V1593 V1594 V1595 V1596 V1597 V1598 V1599 V1600
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE
##    V1601 V1602 V1603 V1604 V1605 V1606 V1607 V1608 V1609 V1610 V1611 V1612
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1613 V1614 V1615 V1616 V1617 V1618 V1619 V1620 V1621 V1622 V1623 V1624
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1625 V1626 V1627 V1628 V1629 V1630 V1631 V1632 V1633 V1634 V1635 V1636
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1637 V1638 V1639 V1640 V1641 V1642 V1643 V1644 V1645 V1646 V1647 V1648
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1649 V1650 V1651 V1652 V1653 V1654 V1655 V1656 V1657 V1658 V1659 V1660
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1661 V1662 V1663 V1664 V1665 V1666 V1667 V1668 V1669 V1670 V1671 V1672
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1673 V1674 V1675 V1676 V1677 V1678 V1679 V1680 V1681 V1682 V1683 V1684
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1685 V1686 V1687 V1688 V1689 V1690 V1691 V1692 V1693 V1694 V1695 V1696
## 48 FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1697 V1698 V1699 V1700 V1701 V1702 V1703 V1704 V1705 V1706 V1707 V1708
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1709 V1710 V1711 V1712 V1713 V1714 V1715 V1716 V1717 V1718 V1719 V1720
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1721 V1722 V1723 V1724 V1725 V1726 V1727 V1728 V1729 V1730 V1731 V1732
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE
##    V1733 V1734 V1735 V1736 V1737 V1738 V1739 V1740 V1741 V1742 V1743 V1744
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1745 V1746 V1747 V1748 V1749 V1750 V1751 V1752 V1753 V1754 V1755 V1756
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1757 V1758 V1759 V1760 V1761 V1762 V1763 V1764 V1765 V1766 V1767 V1768
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1769 V1770 V1771 V1772 V1773 V1774 V1775 V1776 V1777 V1778 V1779 V1780
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1781 V1782 V1783 V1784 V1785 V1786 V1787 V1788 V1789 V1790 V1791 V1792
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1793 V1794 V1795 V1796 V1797 V1798 V1799 V1800 V1801 V1802 V1803 V1804
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1805 V1806 V1807 V1808 V1809 V1810 V1811 V1812 V1813 V1814 V1815 V1816
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1817 V1818 V1819 V1820 V1821 V1822 V1823 V1824 V1825 V1826 V1827 V1828
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1829 V1830 V1831 V1832 V1833 V1834 V1835 V1836 V1837 V1838 V1839 V1840
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1841 V1842 V1843 V1844 V1845 V1846 V1847 V1848 V1849 V1850 V1851 V1852
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1853 V1854 V1855 V1856 V1857 V1858 V1859 V1860 V1861 V1862 V1863 V1864
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1865 V1866 V1867 V1868 V1869 V1870 V1871 V1872 V1873 V1874 V1875 V1876
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1877 V1878 V1879 V1880 V1881 V1882 V1883 V1884 V1885 V1886 V1887 V1888
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1889 V1890 V1891 V1892 V1893 V1894 V1895 V1896 V1897 V1898 V1899 V1900
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1901 V1902 V1903 V1904 V1905 V1906 V1907 V1908 V1909 V1910 V1911 V1912
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1913 V1914 V1915 V1916 V1917 V1918 V1919 V1920 V1921 V1922 V1923 V1924
## 48  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1925 V1926 V1927 V1928 V1929 V1930 V1931 V1932 V1933 V1934 V1935 V1936
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1937 V1938 V1939 V1940 V1941 V1942 V1943 V1944 V1945 V1946 V1947 V1948
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1949 V1950 V1951 V1952 V1953 V1954 V1955 V1956 V1957 V1958 V1959 V1960
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V1961 V1962 V1963 V1964 V1965 V1966 V1967 V1968 V1969 V1970 V1971 V1972
## 48 FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1973 V1974 V1975 V1976 V1977 V1978 V1979 V1980 V1981 V1982 V1983 V1984
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1985 V1986 V1987 V1988 V1989 V1990 V1991 V1992 V1993 V1994 V1995 V1996
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V1997 V1998 V1999 V2000 V2001 V2002 V2003 V2004 V2005 V2006 V2007 V2008
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE
##    V2009 V2010 V2011 V2012 V2013 V2014 V2015 V2016 V2017 V2018 V2019 V2020
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2021 V2022 V2023 V2024 V2025 V2026 V2027 V2028 V2029 V2030 V2031 V2032
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2033 V2034 V2035 V2036 V2037 V2038 V2039 V2040 V2041 V2042 V2043 V2044
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2045 V2046 V2047 V2048 V2049 V2050 V2051 V2052 V2053 V2054 V2055 V2056
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE
##    V2057 V2058 V2059 V2060 V2061 V2062 V2063 V2064 V2065 V2066 V2067 V2068
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2069 V2070 V2071 V2072 V2073 V2074 V2075 V2076 V2077 V2078 V2079 V2080
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2081 V2082 V2083 V2084 V2085 V2086 V2087 V2088 V2089 V2090 V2091 V2092
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2093 V2094 V2095 V2096 V2097 V2098 V2099 V2100 V2101 V2102 V2103 V2104
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE
##    V2105 V2106 V2107 V2108 V2109 V2110 V2111 V2112 V2113 V2114 V2115 V2116
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2117 V2118 V2119 V2120 V2121 V2122 V2123 V2124 V2125 V2126 V2127 V2128
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2129 V2130 V2131 V2132 V2133 V2134 V2135 V2136 V2137 V2138 V2139 V2140
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2141 V2142 V2143 V2144 V2145 V2146 V2147 V2148 V2149 V2150 V2151 V2152
## 48 FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2153 V2154 V2155 V2156 V2157 V2158 V2159 V2160 V2161 V2162 V2163 V2164
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2165 V2166 V2167 V2168 V2169 V2170 V2171 V2172 V2173 V2174 V2175 V2176
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2177 V2178 V2179 V2180 V2181 V2182 V2183 V2184 V2185 V2186 V2187 V2188
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2189 V2190 V2191 V2192 V2193 V2194 V2195 V2196 V2197 V2198 V2199 V2200
## 48  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2201 V2202 V2203 V2204 V2205 V2206 V2207 V2208 V2209 V2210 V2211 V2212
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2213 V2214 V2215 V2216 V2217 V2218 V2219 V2220 V2221 V2222 V2223 V2224
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2225 V2226 V2227 V2228 V2229 V2230 V2231 V2232 V2233 V2234 V2235 V2236
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2237 V2238 V2239 V2240 V2241 V2242 V2243 V2244 V2245 V2246 V2247 V2248
## 48 FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2249 V2250 V2251 V2252 V2253 V2254 V2255 V2256 V2257 V2258 V2259 V2260
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2261 V2262 V2263 V2264 V2265 V2266 V2267 V2268 V2269 V2270 V2271 V2272
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2273 V2274 V2275 V2276 V2277 V2278 V2279 V2280 V2281 V2282 V2283 V2284
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2285 V2286 V2287 V2288 V2289 V2290 V2291 V2292 V2293 V2294 V2295 V2296
## 48  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2297 V2298 V2299 V2300 V2301 V2302 V2303 V2304 V2305 V2306 V2307 V2308
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2309 V2310 V2311 V2312 V2313 V2314 V2315 V2316 V2317 V2318 V2319 V2320
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2321 V2322 V2323 V2324 V2325 V2326 V2327 V2328 V2329 V2330 V2331 V2332
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE
##    V2333 V2334 V2335 V2336 V2337 V2338 V2339 V2340 V2341 V2342 V2343 V2344
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2345 V2346 V2347 V2348 V2349 V2350 V2351 V2352 V2353 V2354 V2355 V2356
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2357 V2358 V2359 V2360 V2361 V2362 V2363 V2364 V2365 V2366 V2367 V2368
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2369 V2370 V2371 V2372 V2373 V2374 V2375 V2376 V2377 V2378 V2379 V2380
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE
##    V2381 V2382 V2383 V2384 V2385 V2386 V2387 V2388 V2389 V2390 V2391 V2392
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2393 V2394 V2395 V2396 V2397 V2398 V2399 V2400 V2401 V2402 V2403 V2404
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2405 V2406 V2407 V2408 V2409 V2410 V2411 V2412 V2413 V2414 V2415 V2416
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2417 V2418 V2419 V2420 V2421 V2422 V2423 V2424 V2425 V2426 V2427 V2428
## 48 FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2429 V2430 V2431 V2432 V2433 V2434 V2435 V2436 V2437 V2438 V2439 V2440
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2441 V2442 V2443 V2444 V2445 V2446 V2447 V2448 V2449 V2450 V2451 V2452
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2453 V2454 V2455 V2456 V2457 V2458 V2459 V2460 V2461 V2462 V2463 V2464
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2465 V2466 V2467 V2468 V2469 V2470 V2471 V2472 V2473 V2474 V2475 V2476
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE
##    V2477 V2478 V2479 V2480 V2481 V2482 V2483 V2484 V2485 V2486 V2487 V2488
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2489 V2490 V2491 V2492 V2493 V2494 V2495 V2496 V2497 V2498 V2499 V2500
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2501 V2502 V2503 V2504 V2505 V2506 V2507 V2508 V2509 V2510 V2511 V2512
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2513 V2514 V2515 V2516 V2517 V2518 V2519 V2520 V2521 V2522 V2523 V2524
## 48 FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2525 V2526 V2527 V2528 V2529 V2530 V2531 V2532 V2533 V2534 V2535 V2536
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2537 V2538 V2539 V2540 V2541 V2542 V2543 V2544 V2545 V2546 V2547 V2548
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2549 V2550 V2551 V2552 V2553 V2554 V2555 V2556 V2557 V2558 V2559 V2560
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2561 V2562 V2563 V2564 V2565 V2566 V2567 V2568 V2569 V2570 V2571 V2572
## 48  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2573 V2574 V2575 V2576 V2577 V2578 V2579 V2580 V2581 V2582 V2583 V2584
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2585 V2586 V2587 V2588 V2589 V2590 V2591 V2592 V2593 V2594 V2595 V2596
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2597 V2598 V2599 V2600 V2601 V2602 V2603 V2604 V2605 V2606 V2607 V2608
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE
##    V2609 V2610 V2611 V2612 V2613 V2614 V2615 V2616 V2617 V2618 V2619 V2620
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2621 V2622 V2623 V2624 V2625 V2626 V2627 V2628 V2629 V2630 V2631 V2632
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2633 V2634 V2635 V2636 V2637 V2638 V2639 V2640 V2641 V2642 V2643 V2644
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2645 V2646 V2647 V2648 V2649 V2650 V2651 V2652 V2653 V2654 V2655 V2656
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE
##    V2657 V2658 V2659 V2660 V2661 V2662 V2663 V2664 V2665 V2666 V2667 V2668
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2669 V2670 V2671 V2672 V2673 V2674 V2675 V2676 V2677 V2678 V2679 V2680
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2681 V2682 V2683 V2684 V2685 V2686 V2687 V2688 V2689 V2690 V2691 V2692
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2693 V2694 V2695 V2696 V2697 V2698 V2699 V2700 V2701 V2702 V2703 V2704
## 48 FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2705 V2706 V2707 V2708 V2709 V2710 V2711 V2712 V2713 V2714 V2715 V2716
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2717 V2718 V2719 V2720 V2721 V2722 V2723 V2724 V2725 V2726 V2727 V2728
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2729 V2730 V2731 V2732 V2733 V2734 V2735 V2736 V2737 V2738 V2739 V2740
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2741 V2742 V2743 V2744 V2745 V2746 V2747 V2748 V2749 V2750 V2751 V2752
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE
##    V2753 V2754 V2755 V2756 V2757 V2758 V2759 V2760 V2761 V2762 V2763 V2764
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2765 V2766 V2767 V2768 V2769 V2770 V2771 V2772 V2773 V2774 V2775 V2776
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2777 V2778 V2779 V2780 V2781 V2782 V2783 V2784 V2785 V2786 V2787 V2788
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2789 V2790 V2791 V2792 V2793 V2794 V2795 V2796 V2797 V2798 V2799 V2800
## 48 FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2801 V2802 V2803 V2804 V2805 V2806 V2807 V2808 V2809 V2810 V2811 V2812
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2813 V2814 V2815 V2816 V2817 V2818 V2819 V2820 V2821 V2822 V2823 V2824
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2825 V2826 V2827 V2828 V2829 V2830 V2831 V2832 V2833 V2834 V2835 V2836
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2837 V2838 V2839 V2840 V2841 V2842 V2843 V2844 V2845 V2846 V2847 V2848
## 48  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2849 V2850 V2851 V2852 V2853 V2854 V2855 V2856 V2857 V2858 V2859 V2860
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2861 V2862 V2863 V2864 V2865 V2866 V2867 V2868 V2869 V2870 V2871 V2872
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2873 V2874 V2875 V2876 V2877 V2878 V2879 V2880 V2881 V2882 V2883 V2884
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE
##    V2885 V2886 V2887 V2888 V2889 V2890 V2891 V2892 V2893 V2894 V2895 V2896
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2897 V2898 V2899 V2900 V2901 V2902 V2903 V2904 V2905 V2906 V2907 V2908
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2909 V2910 V2911 V2912 V2913 V2914 V2915 V2916 V2917 V2918 V2919 V2920
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2921 V2922 V2923 V2924 V2925 V2926 V2927 V2928 V2929 V2930 V2931 V2932
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2933 V2934 V2935 V2936 V2937 V2938 V2939 V2940 V2941 V2942 V2943 V2944
## 48  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2945 V2946 V2947 V2948 V2949 V2950 V2951 V2952 V2953 V2954 V2955 V2956
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2957 V2958 V2959 V2960 V2961 V2962 V2963 V2964 V2965 V2966 V2967 V2968
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V2969 V2970 V2971 V2972 V2973 V2974 V2975 V2976 V2977 V2978 V2979 V2980
## 48 FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2981 V2982 V2983 V2984 V2985 V2986 V2987 V2988 V2989 V2990 V2991 V2992
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V2993 V2994 V2995 V2996 V2997 V2998 V2999 V3000 V3001 V3002 V3003 V3004
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3005 V3006 V3007 V3008 V3009 V3010 V3011 V3012 V3013 V3014 V3015 V3016
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3017 V3018 V3019 V3020 V3021 V3022 V3023 V3024 V3025 V3026 V3027 V3028
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE
##    V3029 V3030 V3031 V3032 V3033 V3034 V3035 V3036 V3037 V3038 V3039 V3040
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3041 V3042 V3043 V3044 V3045 V3046 V3047 V3048 V3049 V3050 V3051 V3052
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3053 V3054 V3055 V3056 V3057 V3058 V3059 V3060 V3061 V3062 V3063 V3064
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3065 V3066 V3067 V3068 V3069 V3070 V3071 V3072 V3073 V3074 V3075 V3076
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3077 V3078 V3079 V3080 V3081 V3082 V3083 V3084 V3085 V3086 V3087 V3088
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3089 V3090 V3091 V3092 V3093 V3094 V3095 V3096 V3097 V3098 V3099 V3100
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3101 V3102 V3103 V3104 V3105 V3106 V3107 V3108 V3109 V3110 V3111 V3112
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3113 V3114 V3115 V3116 V3117 V3118 V3119 V3120 V3121 V3122 V3123 V3124
## 48  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3125 V3126 V3127 V3128 V3129 V3130 V3131 V3132 V3133 V3134 V3135 V3136
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3137 V3138 V3139 V3140 V3141 V3142 V3143 V3144 V3145 V3146 V3147 V3148
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3149 V3150 V3151 V3152 V3153 V3154 V3155 V3156 V3157 V3158 V3159 V3160
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3161 V3162 V3163 V3164 V3165 V3166 V3167 V3168 V3169 V3170 V3171 V3172
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3173 V3174 V3175 V3176 V3177 V3178 V3179 V3180 V3181 V3182 V3183 V3184
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3185 V3186 V3187 V3188 V3189 V3190 V3191 V3192 V3193 V3194 V3195 V3196
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3197 V3198 V3199 V3200 V3201 V3202 V3203 V3204 V3205 V3206 V3207 V3208
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3209 V3210 V3211 V3212 V3213 V3214 V3215 V3216 V3217 V3218 V3219 V3220
## 48  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3221 V3222 V3223 V3224 V3225 V3226 V3227 V3228 V3229 V3230 V3231 V3232
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3233 V3234 V3235 V3236 V3237 V3238 V3239 V3240 V3241 V3242 V3243 V3244
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3245 V3246 V3247 V3248 V3249 V3250 V3251 V3252 V3253 V3254 V3255 V3256
## 48 FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3257 V3258 V3259 V3260 V3261 V3262 V3263 V3264 V3265 V3266 V3267 V3268
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3269 V3270 V3271 V3272 V3273 V3274 V3275 V3276 V3277 V3278 V3279 V3280
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3281 V3282 V3283 V3284 V3285 V3286 V3287 V3288 V3289 V3290 V3291 V3292
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3293 V3294 V3295 V3296 V3297 V3298 V3299 V3300 V3301 V3302 V3303 V3304
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE
##    V3305 V3306 V3307 V3308 V3309 V3310 V3311 V3312 V3313 V3314 V3315 V3316
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3317 V3318 V3319 V3320 V3321 V3322 V3323 V3324 V3325 V3326 V3327 V3328
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3329 V3330 V3331 V3332 V3333 V3334 V3335 V3336 V3337 V3338 V3339 V3340
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE
##    V3341 V3342 V3343 V3344 V3345 V3346 V3347 V3348 V3349 V3350 V3351 V3352
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3353 V3354 V3355 V3356 V3357 V3358 V3359 V3360 V3361 V3362 V3363 V3364
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3365 V3366 V3367 V3368 V3369 V3370 V3371 V3372 V3373 V3374 V3375 V3376
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3377 V3378 V3379 V3380 V3381 V3382 V3383 V3384 V3385 V3386 V3387 V3388
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3389 V3390 V3391 V3392 V3393 V3394 V3395 V3396 V3397 V3398 V3399 V3400
## 48  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3401 V3402 V3403 V3404 V3405 V3406 V3407 V3408 V3409 V3410 V3411 V3412
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3413 V3414 V3415 V3416 V3417 V3418 V3419 V3420 V3421 V3422 V3423 V3424
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3425 V3426 V3427 V3428 V3429 V3430 V3431 V3432 V3433 V3434 V3435 V3436
## 48 FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3437 V3438 V3439 V3440 V3441 V3442 V3443 V3444 V3445 V3446 V3447 V3448
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3449 V3450 V3451 V3452 V3453 V3454 V3455 V3456 V3457 V3458 V3459 V3460
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3461 V3462 V3463 V3464 V3465 V3466 V3467 V3468 V3469 V3470 V3471 V3472
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3473 V3474 V3475 V3476 V3477 V3478 V3479 V3480 V3481 V3482 V3483 V3484
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3485 V3486 V3487 V3488 V3489 V3490 V3491 V3492 V3493 V3494 V3495 V3496
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3497 V3498 V3499 V3500 V3501 V3502 V3503 V3504 V3505 V3506 V3507 V3508
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3509 V3510 V3511 V3512 V3513 V3514 V3515 V3516 V3517 V3518 V3519 V3520
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3521 V3522 V3523 V3524 V3525 V3526 V3527 V3528 V3529 V3530 V3531 V3532
## 48 FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3533 V3534 V3535 V3536 V3537 V3538 V3539 V3540 V3541 V3542 V3543 V3544
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3545 V3546 V3547 V3548 V3549 V3550 V3551 V3552 V3553 V3554 V3555 V3556
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3557 V3558 V3559 V3560 V3561 V3562 V3563 V3564 V3565 V3566 V3567 V3568
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3569 V3570 V3571 V3572 V3573 V3574 V3575 V3576 V3577 V3578 V3579 V3580
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE
##    V3581 V3582 V3583 V3584 V3585 V3586 V3587 V3588 V3589 V3590 V3591 V3592
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3593 V3594 V3595 V3596 V3597 V3598 V3599 V3600 V3601 V3602 V3603 V3604
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V3605 V3606 V3607 V3608 V3609 V3610 V3611 V3612 V3613 V3614 V3615 V3616
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3617 V3618 V3619 V3620 V3621 V3622 V3623 V3624 V3625 V3626 V3627 V3628
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3629 V3630 V3631 V3632 V3633 V3634 V3635 V3636 V3637 V3638 V3639 V3640
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3641 V3642 V3643 V3644 V3645 V3646 V3647 V3648 V3649 V3650 V3651 V3652
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3653 V3654 V3655 V3656 V3657 V3658 V3659 V3660 V3661 V3662 V3663 V3664
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V3665 V3666 V3667 V3668 V3669 V3670 V3671 V3672 V3673 V3674 V3675 V3676
## 48  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
## [ output truncated: row 48 of the logical mask continues through column V6448, alternating long runs of TRUE and FALSE ]
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6449 V6450 V6451 V6452 V6453 V6454 V6455 V6456 V6457 V6458 V6459 V6460
## 48 FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6461 V6462 V6463 V6464 V6465 V6466 V6467 V6468 V6469 V6470 V6471 V6472
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6473 V6474 V6475 V6476 V6477 V6478 V6479 V6480 V6481 V6482 V6483 V6484
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6485 V6486 V6487 V6488 V6489 V6490 V6491 V6492 V6493 V6494 V6495 V6496
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6497 V6498 V6499 V6500 V6501 V6502 V6503 V6504 V6505 V6506 V6507 V6508
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6509 V6510 V6511 V6512 V6513 V6514 V6515 V6516 V6517 V6518 V6519 V6520
## 48  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6521 V6522 V6523 V6524 V6525 V6526 V6527 V6528 V6529 V6530 V6531 V6532
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6533 V6534 V6535 V6536 V6537 V6538 V6539 V6540 V6541 V6542 V6543 V6544
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE
##    V6545 V6546 V6547 V6548 V6549 V6550 V6551 V6552 V6553 V6554 V6555 V6556
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6557 V6558 V6559 V6560 V6561 V6562 V6563 V6564 V6565 V6566 V6567 V6568
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6569 V6570 V6571 V6572 V6573 V6574 V6575 V6576 V6577 V6578 V6579 V6580
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6581 V6582 V6583 V6584 V6585 V6586 V6587 V6588 V6589 V6590 V6591 V6592
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6593 V6594 V6595 V6596 V6597 V6598 V6599 V6600 V6601 V6602 V6603 V6604
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE
##    V6605 V6606 V6607 V6608 V6609 V6610 V6611 V6612 V6613 V6614 V6615 V6616
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6617 V6618 V6619 V6620 V6621 V6622 V6623 V6624 V6625 V6626 V6627 V6628
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6629 V6630 V6631 V6632 V6633 V6634 V6635 V6636 V6637 V6638 V6639 V6640
## 48 FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6641 V6642 V6643 V6644 V6645 V6646 V6647 V6648 V6649 V6650 V6651 V6652
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6653 V6654 V6655 V6656 V6657 V6658 V6659 V6660 V6661 V6662 V6663 V6664
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6665 V6666 V6667 V6668 V6669 V6670 V6671 V6672 V6673 V6674 V6675 V6676
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6677 V6678 V6679 V6680 V6681 V6682 V6683 V6684 V6685 V6686 V6687 V6688
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6689 V6690 V6691 V6692 V6693 V6694 V6695 V6696 V6697 V6698 V6699 V6700
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6701 V6702 V6703 V6704 V6705 V6706 V6707 V6708 V6709 V6710 V6711 V6712
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6713 V6714 V6715 V6716 V6717 V6718 V6719 V6720 V6721 V6722 V6723 V6724
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE
##    V6725 V6726 V6727 V6728 V6729 V6730 V6731 V6732 V6733 V6734 V6735 V6736
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6737 V6738 V6739 V6740 V6741 V6742 V6743 V6744 V6745 V6746 V6747 V6748
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6749 V6750 V6751 V6752 V6753 V6754 V6755 V6756 V6757 V6758 V6759 V6760
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6761 V6762 V6763 V6764 V6765 V6766 V6767 V6768 V6769 V6770 V6771 V6772
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6773 V6774 V6775 V6776 V6777 V6778 V6779 V6780 V6781 V6782 V6783 V6784
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6785 V6786 V6787 V6788 V6789 V6790 V6791 V6792 V6793 V6794 V6795 V6796
## 48  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6797 V6798 V6799 V6800 V6801 V6802 V6803 V6804 V6805 V6806 V6807 V6808
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6809 V6810 V6811 V6812 V6813 V6814 V6815 V6816 V6817 V6818 V6819 V6820
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6821 V6822 V6823 V6824 V6825 V6826 V6827 V6828 V6829 V6830 V6831 V6832
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6833 V6834 V6835 V6836 V6837 V6838 V6839 V6840 V6841 V6842 V6843 V6844
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6845 V6846 V6847 V6848 V6849 V6850 V6851 V6852 V6853 V6854 V6855 V6856
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6857 V6858 V6859 V6860 V6861 V6862 V6863 V6864 V6865 V6866 V6867 V6868
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6869 V6870 V6871 V6872 V6873 V6874 V6875 V6876 V6877 V6878 V6879 V6880
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE
##    V6881 V6882 V6883 V6884 V6885 V6886 V6887 V6888 V6889 V6890 V6891 V6892
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6893 V6894 V6895 V6896 V6897 V6898 V6899 V6900 V6901 V6902 V6903 V6904
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6905 V6906 V6907 V6908 V6909 V6910 V6911 V6912 V6913 V6914 V6915 V6916
## 48 FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6917 V6918 V6919 V6920 V6921 V6922 V6923 V6924 V6925 V6926 V6927 V6928
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6929 V6930 V6931 V6932 V6933 V6934 V6935 V6936 V6937 V6938 V6939 V6940
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6941 V6942 V6943 V6944 V6945 V6946 V6947 V6948 V6949 V6950 V6951 V6952
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6953 V6954 V6955 V6956 V6957 V6958 V6959 V6960 V6961 V6962 V6963 V6964
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V6965 V6966 V6967 V6968 V6969 V6970 V6971 V6972 V6973 V6974 V6975 V6976
## 48  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6977 V6978 V6979 V6980 V6981 V6982 V6983 V6984 V6985 V6986 V6987 V6988
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V6989 V6990 V6991 V6992 V6993 V6994 V6995 V6996 V6997 V6998 V6999 V7000
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7001 V7002 V7003 V7004 V7005 V7006 V7007 V7008 V7009 V7010 V7011 V7012
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7013 V7014 V7015 V7016 V7017 V7018 V7019 V7020 V7021 V7022 V7023 V7024
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7025 V7026 V7027 V7028 V7029 V7030 V7031 V7032 V7033 V7034 V7035 V7036
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7037 V7038 V7039 V7040 V7041 V7042 V7043 V7044 V7045 V7046 V7047 V7048
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7049 V7050 V7051 V7052 V7053 V7054 V7055 V7056 V7057 V7058 V7059 V7060
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE
##    V7061 V7062 V7063 V7064 V7065 V7066 V7067 V7068 V7069 V7070 V7071 V7072
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7073 V7074 V7075 V7076 V7077 V7078 V7079 V7080 V7081 V7082 V7083 V7084
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7085 V7086 V7087 V7088 V7089 V7090 V7091 V7092 V7093 V7094 V7095 V7096
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE
##    V7097 V7098 V7099 V7100 V7101 V7102 V7103 V7104 V7105 V7106 V7107 V7108
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7109 V7110 V7111 V7112 V7113 V7114 V7115 V7116 V7117 V7118 V7119 V7120
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7121 V7122 V7123 V7124 V7125 V7126 V7127 V7128 V7129 V7130 V7131 V7132
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7133 V7134 V7135 V7136 V7137 V7138 V7139 V7140 V7141 V7142 V7143 V7144
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7145 V7146 V7147 V7148 V7149 V7150 V7151 V7152 V7153 V7154 V7155 V7156
## 48  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7157 V7158 V7159 V7160 V7161 V7162 V7163 V7164 V7165 V7166 V7167 V7168
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7169 V7170 V7171 V7172 V7173 V7174 V7175 V7176 V7177 V7178 V7179 V7180
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7181 V7182 V7183 V7184 V7185 V7186 V7187 V7188 V7189 V7190 V7191 V7192
## 48 FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7193 V7194 V7195 V7196 V7197 V7198 V7199 V7200 V7201 V7202 V7203 V7204
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7205 V7206 V7207 V7208 V7209 V7210 V7211 V7212 V7213 V7214 V7215 V7216
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7217 V7218 V7219 V7220 V7221 V7222 V7223 V7224 V7225 V7226 V7227 V7228
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7229 V7230 V7231 V7232 V7233 V7234 V7235 V7236 V7237 V7238 V7239 V7240
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE
##    V7241 V7242 V7243 V7244 V7245 V7246 V7247 V7248 V7249 V7250 V7251 V7252
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7253 V7254 V7255 V7256 V7257 V7258 V7259 V7260 V7261 V7262 V7263 V7264
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7265 V7266 V7267 V7268 V7269 V7270 V7271 V7272 V7273 V7274 V7275 V7276
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7277 V7278 V7279 V7280 V7281 V7282 V7283 V7284 V7285 V7286 V7287 V7288
## 48 FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7289 V7290 V7291 V7292 V7293 V7294 V7295 V7296 V7297 V7298 V7299 V7300
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7301 V7302 V7303 V7304 V7305 V7306 V7307 V7308 V7309 V7310 V7311 V7312
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7313 V7314 V7315 V7316 V7317 V7318 V7319 V7320 V7321 V7322 V7323 V7324
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7325 V7326 V7327 V7328 V7329 V7330 V7331 V7332 V7333 V7334 V7335 V7336
## 48  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7337 V7338 V7339 V7340 V7341 V7342 V7343 V7344 V7345 V7346 V7347 V7348
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7349 V7350 V7351 V7352 V7353 V7354 V7355 V7356 V7357 V7358 V7359 V7360
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7361 V7362 V7363 V7364 V7365 V7366 V7367 V7368 V7369 V7370 V7371 V7372
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE
##    V7373 V7374 V7375 V7376 V7377 V7378 V7379 V7380 V7381 V7382 V7383 V7384
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7385 V7386 V7387 V7388 V7389 V7390 V7391 V7392 V7393 V7394 V7395 V7396
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7397 V7398 V7399 V7400 V7401 V7402 V7403 V7404 V7405 V7406 V7407 V7408
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7409 V7410 V7411 V7412 V7413 V7414 V7415 V7416 V7417 V7418 V7419 V7420
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7421 V7422 V7423 V7424 V7425 V7426 V7427 V7428 V7429 V7430 V7431 V7432
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7433 V7434 V7435 V7436 V7437 V7438 V7439 V7440 V7441 V7442 V7443 V7444
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7445 V7446 V7447 V7448 V7449 V7450 V7451 V7452 V7453 V7454 V7455 V7456
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7457 V7458 V7459 V7460 V7461 V7462 V7463 V7464 V7465 V7466 V7467 V7468
## 48 FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7469 V7470 V7471 V7472 V7473 V7474 V7475 V7476 V7477 V7478 V7479 V7480
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7481 V7482 V7483 V7484 V7485 V7486 V7487 V7488 V7489 V7490 V7491 V7492
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7493 V7494 V7495 V7496 V7497 V7498 V7499 V7500 V7501 V7502 V7503 V7504
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7505 V7506 V7507 V7508 V7509 V7510 V7511 V7512 V7513 V7514 V7515 V7516
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7517 V7518 V7519 V7520 V7521 V7522 V7523 V7524 V7525 V7526 V7527 V7528
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7529 V7530 V7531 V7532 V7533 V7534 V7535 V7536 V7537 V7538 V7539 V7540
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7541 V7542 V7543 V7544 V7545 V7546 V7547 V7548 V7549 V7550 V7551 V7552
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7553 V7554 V7555 V7556 V7557 V7558 V7559 V7560 V7561 V7562 V7563 V7564
## 48 FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7565 V7566 V7567 V7568 V7569 V7570 V7571 V7572 V7573 V7574 V7575 V7576
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7577 V7578 V7579 V7580 V7581 V7582 V7583 V7584 V7585 V7586 V7587 V7588
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7589 V7590 V7591 V7592 V7593 V7594 V7595 V7596 V7597 V7598 V7599 V7600
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE
##    V7601 V7602 V7603 V7604 V7605 V7606 V7607 V7608 V7609 V7610 V7611 V7612
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7613 V7614 V7615 V7616 V7617 V7618 V7619 V7620 V7621 V7622 V7623 V7624
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7625 V7626 V7627 V7628 V7629 V7630 V7631 V7632 V7633 V7634 V7635 V7636
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7637 V7638 V7639 V7640 V7641 V7642 V7643 V7644 V7645 V7646 V7647 V7648
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE
##    V7649 V7650 V7651 V7652 V7653 V7654 V7655 V7656 V7657 V7658 V7659 V7660
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7661 V7662 V7663 V7664 V7665 V7666 V7667 V7668 V7669 V7670 V7671 V7672
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7673 V7674 V7675 V7676 V7677 V7678 V7679 V7680 V7681 V7682 V7683 V7684
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7685 V7686 V7687 V7688 V7689 V7690 V7691 V7692 V7693 V7694 V7695 V7696
## 48  TRUE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7697 V7698 V7699 V7700 V7701 V7702 V7703 V7704 V7705 V7706 V7707 V7708
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7709 V7710 V7711 V7712 V7713 V7714 V7715 V7716 V7717 V7718 V7719 V7720
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7721 V7722 V7723 V7724 V7725 V7726 V7727 V7728 V7729 V7730 V7731 V7732
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7733 V7734 V7735 V7736 V7737 V7738 V7739 V7740 V7741 V7742 V7743 V7744
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE
##    V7745 V7746 V7747 V7748 V7749 V7750 V7751 V7752 V7753 V7754 V7755 V7756
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7757 V7758 V7759 V7760 V7761 V7762 V7763 V7764 V7765 V7766 V7767 V7768
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7769 V7770 V7771 V7772 V7773 V7774 V7775 V7776 V7777 V7778 V7779 V7780
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7781 V7782 V7783 V7784 V7785 V7786 V7787 V7788 V7789 V7790 V7791 V7792
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7793 V7794 V7795 V7796 V7797 V7798 V7799 V7800 V7801 V7802 V7803 V7804
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7805 V7806 V7807 V7808 V7809 V7810 V7811 V7812 V7813 V7814 V7815 V7816
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7817 V7818 V7819 V7820 V7821 V7822 V7823 V7824 V7825 V7826 V7827 V7828
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7829 V7830 V7831 V7832 V7833 V7834 V7835 V7836 V7837 V7838 V7839 V7840
## 48 FALSE FALSE FALSE FALSE FALSE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7841 V7842 V7843 V7844 V7845 V7846 V7847 V7848 V7849 V7850 V7851 V7852
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE
##    V7853 V7854 V7855 V7856 V7857 V7858 V7859 V7860 V7861 V7862 V7863 V7864
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE
##    V7865 V7866 V7867 V7868 V7869 V7870 V7871 V7872 V7873 V7874 V7875 V7876
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7877 V7878 V7879 V7880 V7881 V7882 V7883 V7884 V7885 V7886 V7887 V7888
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7889 V7890 V7891 V7892 V7893 V7894 V7895 V7896 V7897 V7898 V7899 V7900
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7901 V7902 V7903 V7904 V7905 V7906 V7907 V7908 V7909 V7910 V7911 V7912
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7913 V7914 V7915 V7916 V7917 V7918 V7919 V7920 V7921 V7922 V7923 V7924
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7925 V7926 V7927 V7928 V7929 V7930 V7931 V7932 V7933 V7934 V7935 V7936
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7937 V7938 V7939 V7940 V7941 V7942 V7943 V7944 V7945 V7946 V7947 V7948
## 48  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE  TRUE FALSE FALSE FALSE  TRUE  TRUE
##    V7949 V7950 V7951 V7952 V7953 V7954 V7955 V7956 V7957 V7958 V7959 V7960
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7961 V7962 V7963 V7964 V7965 V7966 V7967 V7968 V7969 V7970 V7971 V7972
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7973 V7974 V7975 V7976 V7977 V7978 V7979 V7980 V7981 V7982 V7983 V7984
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7985 V7986 V7987 V7988 V7989 V7990 V7991 V7992 V7993 V7994 V7995 V7996
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V7997 V7998 V7999 V8000 V8001 V8002 V8003 V8004 V8005 V8006 V8007 V8008
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8009 V8010 V8011 V8012 V8013 V8014 V8015 V8016 V8017 V8018 V8019 V8020
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8021 V8022 V8023 V8024 V8025 V8026 V8027 V8028 V8029 V8030 V8031 V8032
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8033 V8034 V8035 V8036 V8037 V8038 V8039 V8040 V8041 V8042 V8043 V8044
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8045 V8046 V8047 V8048 V8049 V8050 V8051 V8052 V8053 V8054 V8055 V8056
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8057 V8058 V8059 V8060 V8061 V8062 V8063 V8064 V8065 V8066 V8067 V8068
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8069 V8070 V8071 V8072 V8073 V8074 V8075 V8076 V8077 V8078 V8079 V8080
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8081 V8082 V8083 V8084 V8085 V8086 V8087 V8088 V8089 V8090 V8091 V8092
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8093 V8094 V8095 V8096 V8097 V8098 V8099 V8100 V8101 V8102 V8103 V8104
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8105 V8106 V8107 V8108 V8109 V8110 V8111 V8112 V8113 V8114 V8115 V8116
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8117 V8118 V8119 V8120 V8121 V8122 V8123 V8124 V8125 V8126 V8127 V8128
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8129 V8130 V8131 V8132 V8133 V8134 V8135 V8136 V8137 V8138 V8139 V8140
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8141 V8142 V8143 V8144 V8145 V8146 V8147 V8148 V8149 V8150 V8151 V8152
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8153 V8154 V8155 V8156 V8157 V8158 V8159 V8160 V8161 V8162 V8163 V8164
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8165 V8166 V8167 V8168 V8169 V8170 V8171 V8172 V8173 V8174 V8175 V8176
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8177 V8178 V8179 V8180 V8181 V8182 V8183 V8184 V8185 V8186 V8187 V8188
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8189 V8190 V8191 V8192 V8193 V8194 V8195 V8196 V8197 V8198 V8199 V8200
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8201 V8202 V8203 V8204 V8205 V8206 V8207 V8208 V8209 V8210 V8211 V8212
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8213 V8214 V8215 V8216 V8217 V8218 V8219 V8220 V8221 V8222 V8223 V8224
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8225 V8226 V8227 V8228 V8229 V8230 V8231 V8232 V8233 V8234 V8235 V8236
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8237 V8238 V8239 V8240 V8241 V8242 V8243 V8244 V8245 V8246 V8247 V8248
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8249 V8250 V8251 V8252 V8253 V8254 V8255 V8256 V8257 V8258 V8259 V8260
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8261 V8262 V8263 V8264 V8265 V8266 V8267 V8268 V8269 V8270 V8271 V8272
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8273 V8274 V8275 V8276 V8277 V8278 V8279 V8280 V8281 V8282 V8283 V8284
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8285 V8286 V8287 V8288 V8289 V8290 V8291 V8292 V8293 V8294 V8295 V8296
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8297 V8298 V8299 V8300 V8301 V8302 V8303 V8304 V8305 V8306 V8307 V8308
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8309 V8310 V8311 V8312 V8313 V8314 V8315 V8316 V8317 V8318 V8319 V8320
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8321 V8322 V8323 V8324 V8325 V8326 V8327 V8328 V8329 V8330 V8331 V8332
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8333 V8334 V8335 V8336 V8337 V8338 V8339 V8340 V8341 V8342 V8343 V8344
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8345 V8346 V8347 V8348 V8349 V8350 V8351 V8352 V8353 V8354 V8355 V8356
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8357 V8358 V8359 V8360 V8361 V8362 V8363 V8364 V8365 V8366 V8367 V8368
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8369 V8370 V8371 V8372 V8373 V8374 V8375 V8376 V8377 V8378 V8379 V8380
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8381 V8382 V8383 V8384 V8385 V8386 V8387 V8388 V8389 V8390 V8391 V8392
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8393 V8394 V8395 V8396 V8397 V8398 V8399 V8400 V8401 V8402 V8403 V8404
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8405 V8406 V8407 V8408 V8409 V8410 V8411 V8412 V8413 V8414 V8415 V8416
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8417 V8418 V8419 V8420 V8421 V8422 V8423 V8424 V8425 V8426 V8427 V8428
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8429 V8430 V8431 V8432 V8433 V8434 V8435 V8436 V8437 V8438 V8439 V8440
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8441 V8442 V8443 V8444 V8445 V8446 V8447 V8448 V8449 V8450 V8451 V8452
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE
##    V8453 V8454 V8455 V8456 V8457 V8458 V8459 V8460 V8461 V8462 V8463 V8464
## 48 FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE FALSE

2.4.1.6 Convert all images to matrices, one matrix per patient

require(stringr)
require(data.table)


get_best_masks_img2mat <- function(path){

  # list all patient files in the directory
  list_path <- as.list(list.files(path, full.names = TRUE))

  # convert each 3D segmentation mask into a list of slice matrices
  list_matrices <- lapply(list_path, function(x) mat2vec(x, target = c("mask")))

  # keep, for each patient, the slice with the largest tumour area
  best_masks <- lapply(list_matrices, function(x) get_best_mask(x))

  # extract the patient ID from the file name, e.g. "patient_202" -> "202"
  nm <- lapply(list_path, function(x)
    str_remove(tools::file_path_sans_ext(basename(x)), "patient_"))

  # bind all patients into one table; * 1 coerces logicals to 0/1
  img_df <- data.table::rbindlist(best_masks) * 1

  row.names(img_df) <- nm

  df_img <- tibble::rownames_to_column(img_df, "PatientID")

  return(df_img)
}

img_tr <- get_best_masks_img2mat("train/images/")
img_ts <- get_best_masks_img2mat("test/images/")

output_train <- data.table::fread("output_train.csv")
output_test <- data.table::fread("output_test.csv")

train_img <- img_tr %>%
    mutate(PatientID = as.numeric(PatientID)) %>%
    left_join(output_train, by = "PatientID") %>%
    select(PatientID, SurvivalTime, Event, everything())


test_img <- img_ts %>%
    mutate(PatientID = as.numeric(PatientID)) %>%
    left_join(output_test, by = "PatientID") %>%
    select(PatientID, SurvivalTime, Event, everything())



train_img[1:5,1:10]
##   PatientID SurvivalTime Event V1 V2 V3 V4 V5 V6 V7
## 1         2          638     0  0  0  0  0  0  0  0
## 2         3          421     0  0  0  0  0  0  0  0
## 3         4          465     1  0  0  0  0  0  0  0
## 4         5         1295     1  0  0  0  0  0  0  0
## 5         7         1393     0  0  0  0  0  0  0  0
data.table::fwrite(train_img , "train_img.csv")
data.table::fwrite(test_img , "test_img.csv")
#rowSums(train_img[,-1])
rm(list = ls())
invisible(gc())
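The reduction above keeps, for each 3-D mask, the slice with the largest tumour area and flattens it into one row per patient. As a minimal illustration of that selection step (a hypothetical Python sketch with toy 2x2 slices, not the report's `get_best_mask` implementation):

```python
# Sketch: pick the "best" mask slice (most tumour pixels) and flatten it.
# Masks are nested lists of 0/1 here; the real slices are much larger.

def best_mask_slice(volume):
    """Return the flattened slice with the largest tumour area."""
    best = max(volume, key=lambda sl: sum(sum(row) for row in sl))
    return [px for row in best for px in row]

volume = [
    [[0, 0], [0, 1]],  # slice 0: area 1
    [[1, 1], [0, 1]],  # slice 1: area 3 -> selected
    [[0, 0], [0, 0]],  # slice 2: area 0
]
print(best_mask_slice(volume))  # -> [1, 1, 0, 1]
```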

3 R xgboost with best masks

3.1 Split Train dataset into Train & Valid sets

require(rsample)

train_img <- data.table::fread("train_img.csv")
test_img <- data.table::fread("test_img.csv")

# drop PatientID from the training set and Event from the test set
train_img[, PatientID := NULL]
test_img[, Event := NULL]

set.seed(100)
train_valid_split <- rsample::initial_split(train_img, prop = 0.8)
train_valid_split
## <241/59/300>
train_img[,1:7]
##      SurvivalTime Event V1 V2 V3 V4 V5
##   1:          638     0  0  0  0  0  0
##   2:          421     0  0  0  0  0  0
##   3:          465     1  0  0  0  0  0
##   4:         1295     1  0  0  0  0  0
##   5:         1393     0  0  0  0  0  0
##  ---                                  
## 296:          528     1  0  0  0  0  0
## 297:         1503     0  0  0  0  0  0
## 298:          315     0  0  0  0  0  0
## 299:          360     1  0  0  0  0  0
## 300:          515     1  0  0  0  0  0
  • We retrieve the training and validation sets with the training() and testing() functions.
# Retrieve train and test sets
train_8 <- rsample::training(train_valid_split)
valid_2  <- rsample::testing(train_valid_split)
train_8[1:10, 1:10]
##     SurvivalTime Event V1 V2 V3 V4 V5 V6 V7 V8
##  1:          638     0  0  0  0  0  0  0  0  0
##  2:          421     0  0  0  0  0  0  0  0  0
##  3:          465     1  0  0  0  0  0  0  0  0
##  4:         1393     0  0  0  0  0  0  0  0  0
##  5:         1076     1  0  0  0  0  0  0  0  0
##  6:           87     1  0  0  0  0  0  0  0  0
##  7:           98     1  0  0  0  0  0  0  0  0
##  8:          316     0  0  0  0  0  0  0  0  0
##  9:           65     1  0  0  0  0  0  0  0  0
## 10:          476     1  0  0  0  0  0  0  0  0
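`initial_split` simply draws a random 80/20 partition of the 300 patients. The same idea can be sketched in Python (a hypothetical illustration, not rsample's exact sampling scheme, which here happened to yield 241/59):

```python
import random

def initial_split(n_rows, prop=0.8, seed=100):
    """Randomly partition row indices into train/validation index lists."""
    rng = random.Random(seed)
    idx = list(range(n_rows))
    rng.shuffle(idx)
    cut = round(n_rows * prop)
    return idx[:cut], idx[cut:]

train_idx, valid_idx = initial_split(300)
print(len(train_idx), len(valid_idx))  # -> 240 60
```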

3.2 Format train and test to DMatrix (R, Masks)

require(Matrix)
require(xgboost)


# na.pass keeps rows with missing values (e.g. in the age column) instead of dropping them
options(na.action='na.pass')
train_8_sparse <- sparse.model.matrix(Event ~., data=train_8)
dtrain_8 <- xgb.DMatrix(data=train_8_sparse, label = train_8$Event)

options(na.action='na.pass')
valid_2_sparse <- sparse.model.matrix(Event ~., data=valid_2)
dvalid_2 <- xgb.DMatrix(data=valid_2_sparse, label = valid_2$Event)

3.3 Optimize features with Cross validation

Here we can see after how many rounds the best test AUC is reached.

params <- list(booster = "gbtree",
              tree_method = "auto",
              objective = "binary:logistic",
              eval_metric = "auc",         # area under the ROC curve
              max_depth = 2,        # 6 makes training heavy, there is no correlation between features #1 is not better
              eta = 0.03,                     # learning rate
              subsample = 0.8,              # prevent overfitting
              colsample_bytree = 0.1         # specify the fraction of columns to be subsampled. # 0.5 is not better
             )


tme <- Sys.time()
cv_model <- xgb.cv(params = params,
                   data = dtrain_8,
                   nthread = parallel::detectCores(all.tests = FALSE, logical = TRUE),  #2,
                   nrounds = 25000,
                   verbose = TRUE,
                   nfold = 7,
                   print_every_n = 500,
                   early_stopping_rounds = 1000,
                   maximize = TRUE,
                   prediction = TRUE) # prediction of cv folds
## [1]  train-auc:0.682223+0.055012 test-auc:0.569252+0.097524 
## Multiple eval metrics are present. Will use test_auc for early stopping.
## Will train until test_auc hasn't improved in 1000 rounds.
## 
## [501]    train-auc:0.995171+0.000974 test-auc:0.731163+0.052885 
## [1001]   train-auc:0.999758+0.000141 test-auc:0.734070+0.056987 
## [1501]   train-auc:0.999919+0.000033 test-auc:0.731769+0.050985 
## Stopping. Best iteration:
## [607]    train-auc:0.997563+0.000629 test-auc:0.738315+0.050803
Sys.time() - tme
## Time difference of 2.453851 mins
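The early-stopping rule applied above (keep the best test AUC seen so far; stop once it has not improved for early_stopping_rounds rounds) can be illustrated with a small standalone sketch; this mimics the logic rather than the library's implementation:

```python
def early_stopping_best_round(scores, patience):
    """Return (best_round, best_score) under early stopping.

    scores: per-round evaluation metric (higher is better, like AUC).
    patience: stop after this many rounds without improvement.
    """
    best_score, best_round = float("-inf"), 0
    for i, score in enumerate(scores, start=1):
        if score > best_score:
            best_score, best_round = score, i
        elif i - best_round >= patience:
            break  # no improvement for `patience` rounds: stop
    return best_round, best_score

# Toy AUC trajectory: improves, plateaus, then degrades.
rounds, best = early_stopping_best_round([0.60, 0.70, 0.74, 0.73, 0.72, 0.71], patience=2)
print(rounds, best)  # -> 3 0.74
```

This is why cv_model$best_iteration (607 here) is passed to xgb.train below rather than the full 25000 rounds.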

3.4 Train the model

watchlist <- list(train = dtrain_8, eval = dvalid_2)
tme <- Sys.time()
xgboost_tree_img <- xgb.train(data = dtrain_8, 
                         params = params,
                         watchlist = watchlist,
                         nrounds = cv_model$best_iteration, # more than 12000 ~0.897
                         print_every_n = 500,
                         verbose = TRUE)
## [1]  train-auc:0.673327  eval-auc:0.594706 
## [501]    train-auc:0.992188  eval-auc:0.852941 
## [607]    train-auc:0.996197  eval-auc:0.851765
Sys.time() - tme
## Time difference of 10.61282 secs
## Predict valid_2 dataset

pred_valid_img <- predict(xgboost_tree_img, dvalid_2)

paste0('SUMMARY: '); summary(pred_valid_img)
## [1] "SUMMARY: "
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
##  0.1078  0.3356  0.6286  0.5908  0.8195  0.9572
# We assume that if the predicted probability >= 0.5, the Event is 1, else 0
pred_bin_img <- as.numeric(pred_valid_img >= 0.5)

paste("RATIO: "); table(pred_bin_img)
## [1] "RATIO: "
## pred_bin_img
##  0  1 
## 21 38
## Confusion matrix for Tree model
data.frame(prediction = as.numeric(pred_bin_img),
         label = as.numeric(valid_2$Event)) %>%
         count(prediction, label)
## # A tibble: 4 x 3
##   prediction label     n
##        <dbl> <dbl> <int>
## 1          0     0    16
## 2          0     1     5
## 3          1     0     9
## 4          1     1    29
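From the counts above (TN = 16, FN = 5, FP = 9, TP = 29) accuracy, sensitivity, and specificity follow directly; a minimal sketch:

```python
tn, fn, fp, tp = 16, 5, 9, 29  # counts from the confusion matrix above

accuracy = (tp + tn) / (tp + tn + fp + fn)  # overall correctness
sensitivity = tp / (tp + fn)                # recall on Event == 1
specificity = tn / (tn + fp)                # recall on Event == 0

print(round(accuracy, 3), round(sensitivity, 3), round(specificity, 3))
# -> 0.763 0.853 0.64
```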

3.5 Prediction with (R, Masks)

test_sparse <- sparse.model.matrix(PatientID ~., data=test_img)
dtest <- xgb.DMatrix(data=test_sparse, label = test_img$PatientID)
 
pred_tree_img <- predict(xgboost_tree_img, dtest)

pred_bin_xgboost_img <- as.numeric(pred_tree_img >= 0.5)


print("ratio of predicted Events in test dataset: "); table(pred_bin_xgboost_img)
## [1] "ratio of predicted Events in test dataset: "
## pred_bin_xgboost_img
##  0  1 
## 54 71
print("ratio of Events in train dataset: "); table(train_img$Event)
## [1] "ratio of Events in train dataset: "
## 
##   0   1 
## 138 162
## submission

pred_img <- data.frame(
  PatientID = test_img$PatientID,
  Event = pred_tree_img
)

output_test <- fread("output_test.csv")

submission_img <- output_test %>%
  select(PatientID, SurvivalTime) %>%
  left_join(pred_img, by = "PatientID")

fwrite(submission_img, "submission_img.csv")

## Refresh memory
rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, test_img, train_valid_split, patient2, patient_002, valid_2, train_8_sparse, masks2, ls_mask_patient_002, ls_scan_patient_002, test_sparse, valid_2_sparse)
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'img_tr' not found
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'img_ts' not found
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'patient2' not found
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'patient_002' not found
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'masks2' not found
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'ls_mask_patient_002' not found
## Warning in rm(dtest, dtrain_8, dvalid_2, train_8, img_tr, img_ts, train_img, :
## object 'ls_scan_patient_002' not found
rm(list=ls())
invisible(gc())

4 Python xgboost with best masks

import matplotlib
import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
import seaborn as sns
from sklearn.model_selection import train_test_split
import xgboost

test_img = pd.read_csv("test_img.csv")
train_img = pd.read_csv("train_img.csv")

# Set seed
SEED = 1423

#test_img.pop('PatientID')
#test_img.pop('Event')
#test_img.head()
# split data into train and validation sets
features = [feat for feat in list(train_img) if feat != 'Event']
X_train, X_valid, y_train, y_valid = train_test_split(train_img[features], 
                                                 train_img[['Event']], 
                                                test_size=0.3, 
                                                 random_state=SEED)
 


import xgboost  as xgb
xgb_params = {
    'max_depth':3, 
    'eta':0.01, 
    'silent':0, 
    'eval_metric':'auc',
    'subsample': 0.8,
    'colsample_bytree': 0.8,
    'objective':'binary:logistic',
    'seed' : 1423
}

dtrain = xgb.DMatrix(X_train, y_train, feature_names=X_train.columns.values)
dvalid = xgb.DMatrix(X_valid, y_valid, feature_names=X_valid.columns.values)


evals = [(dtrain,'train'),(dvalid,'eval')]
xgb_model_img = xgb.train ( params = xgb_params,
              dtrain = dtrain,
              num_boost_round = 5000,
              verbose_eval=200, 
              early_stopping_rounds = 500,
              evals=evals,
              maximize = True)
## [0]  train-auc:0.847485  eval-auc:0.81475
## Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.
## 
## Will train until eval-auc hasn't improved in 500 rounds.
## [200]    train-auc:0.961917  eval-auc:0.85
## [400]    train-auc:0.983145  eval-auc:0.8545
## [600]    train-auc:0.995171  eval-auc:0.858
## [800]    train-auc:0.998907  eval-auc:0.857
## [1000]   train-auc:0.999818  eval-auc:0.8555
## Stopping. Best iteration:
## [615]    train-auc:0.995809  eval-auc:0.8595
# get dataframe of feature importances for the model
xgb_img_imp = pd.DataFrame(
    list(xgb_model_img.get_fscore().items()),
    columns=['feature', 'importance']
).sort_values('importance', ascending=False)
xgb_img_imp.head(10)
##           feature  importance
## 0    SurvivalTime         899
## 14      PatientID         430
## 2           V4823          82
## 94          V4620          74
## 130         V6370          73
## 231         V3701          61
## 105         V3539          53
## 104         V4639          48
## 52          V5538          48
## 39          V6018          45
# Confusion matrix
dval_predictions = xgb_model_img.predict(dvalid)

from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_valid, [1 if p > 0.5 else 0 for p in dval_predictions])

plt.figure(figsize = (6,4))
plt.ticklabel_format(style='plain', axis='y', useOffset=False)
sns.set(font_scale=1.4)
sns.heatmap(cm, annot=True, annot_kws={"size": 16}) 
plt.show()

# evaluate predictions
from sklearn.metrics import accuracy_score

predictions_img = [round(value) for value in dval_predictions]

accuracy = accuracy_score(y_valid, predictions_img)

print("Accuracy: %.2f%%" % (accuracy * 100.0))
## Accuracy: 78.89%
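Since auc is the eval_metric used throughout, it helps to recall what it measures: the probability that a randomly chosen positive is scored above a randomly chosen negative. A minimal rank-based sketch (equivalent to the Mann-Whitney formulation; tie handling omitted for brevity):

```python
def auc(labels, scores):
    """AUC via the rank-sum (Mann-Whitney) formulation; assumes no tied scores."""
    # rank of each score (1 = smallest)
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = {i: r + 1 for r, i in enumerate(order)}
    pos = [i for i, y in enumerate(labels) if y == 1]
    n_pos, n_neg = len(pos), len(labels) - len(pos)
    rank_sum = sum(ranks[i] for i in pos)
    return (rank_sum - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# A perfect ranking scores 1.0; one misordered pair lowers it.
print(auc([0, 0, 1, 1], [0.1, 0.2, 0.8, 0.9]))  # -> 1.0
print(auc([0, 1, 0, 1], [0.1, 0.6, 0.7, 0.9]))  # -> 0.75
```

Unlike accuracy, AUC is threshold-free, which is why it can stay high even when the 0.5 cutoff gives mediocre accuracy.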

4.1 Prediction (Python,Masks)


dtest = xgb.DMatrix(test_img.drop('PatientID', axis=1),  feature_names=X_train.columns.values)

dtest_predictions = xgb_model_img.predict(dtest)

output_test = pd.read_csv("output_test.csv", index_col = False)



# Format predictions in DataFrame: prediction_df
#prediction_df = pd.DataFrame(PatientID = test_img['PatientID'] ,
#                             Event = predictions_img)
                             
                             
#output_test.join(submission_img.set_index('PatientID'), on='PatientID')

output_test['Event'] = dtest_predictions


output_test.to_csv("submission_img_p.csv", index=False)

## save predictions for an ensemble
#pickle.dump(Y_pred, open('xgb_train.pickle', 'wb'))
#pickle.dump(Y_test, open('xgb_test.pickle', 'wb'))

5 Exploratory Data Analysis of radiomics and clinical data

#Load dataset
radiomics <- fread("train/features/radiomics.csv", quote = "")
clinical <- fread("train/features/clinical_data.csv")

# display only the first 6 rows and 8 columns
head(radiomics)[,1:8]
##           V1                          V2                          V3
## 1:                                 shape                       shape
## 2:           original_shape_Compactness1 original_shape_Compactness2
## 3: PatientID                                                        
## 4:       202                 0.027815034                 0.274891585
## 5:       371                  0.02301549                 0.188210005
## 6:       246                 0.027348106                 0.265739895
##                                  V4                                    V5
## 1:                            shape                                 shape
## 2: original_shape_Maximum3DDiameter original_shape_SphericalDisproportion
## 3:                                                                       
## 4:                      48.55924217                           1.537964054
## 5:                      75.70336849                           1.744961158
## 6:                      70.43436661                           1.555420243
##                           V6                         V7
## 1:                     shape                      shape
## 2: original_shape_Sphericity original_shape_SurfaceArea
## 3:                                                     
## 4:               0.650210255                 5431.33321
## 5:               0.573078659                10369.56873
## 6:               0.642913068                10558.81869
##                                   V8
## 1:                             shape
## 2: original_shape_SurfaceVolumeRatio
## 3:                                  
## 4:                       0.275227763
## 5:                       0.240726824
## 6:                       0.200765988
head(clinical)
##    PatientID               Histology Mstage Nstage SourceDataset Tstage     age
## 1:       202          Adenocarcinoma      0      0            l2      2 66.0000
## 2:       371              large cell      0      2            l1      4 64.5722
## 3:       246 squamous cell carcinoma      0      3            l1      2 66.0452
## 4:       240                     nos      0      2            l1      3 59.3566
## 5:       284 squamous cell carcinoma      0      3            l1      4 71.0554
## 6:       348 squamous cell carcinoma      0      2            l1      2 65.0212

The radiomics features can be divided into 4 groups, shown in row 1:

  • Group 1: first-order statistics
  • Group 2: shape- and size-based features
  • Group 3: textural features
  • Group 4: wavelet features

Each group can be subset into several sub-groups, shown in row 2 of the radiomics dataset. To turn the radiomics features into a numeric dataset, we need to remove the first two rows and convert them into column names.

groups <- radiomics[1:2,-1] %>%
  t() %>%
  as.data.frame() %>%
  rename("Groups" = V1, "Features" = V2) #%>%
#  remove_rownames()

head(groups)
##    Groups                              Features
## V2  shape           original_shape_Compactness1
## V3  shape           original_shape_Compactness2
## V4  shape      original_shape_Maximum3DDiameter
## V5  shape original_shape_SphericalDisproportion
## V6  shape             original_shape_Sphericity
## V7  shape            original_shape_SurfaceArea

Before renaming the columns, we look at how the features are distributed across the groups.

5.1 Plot the distribution of Groups and Features

groups %>%
  group_by(Groups, Features) %>%
  summarise(n_Features = n()) %>%
  ggplot() +
  aes(x = Groups, y = n_Features, color = Features ) +
  geom_col() +
  theme(legend.position = "none") +
  ggtitle("Number of Features by Group")

5.1.1 Set New Colnames of radiomics

new_colnames_radiomics <- groups %>%
  mutate(Features = stringr::str_remove(Features,"original_")) %>%
  pull(Features)

new_colnames_radiomics %>% head()
## [1] "shape_Compactness1"           "shape_Compactness2"          
## [3] "shape_Maximum3DDiameter"      "shape_SphericalDisproportion"
## [5] "shape_Sphericity"             "shape_SurfaceArea"

5.1.2 Get new radiomics style

old_names <- colnames(radiomics)
new_names <- c("PatientID", new_colnames_radiomics)

new_radiomics <- radiomics[-1:-3,] %>%
  rename_at(vars(old_names), ~ new_names) %>%
  mutate_if(is.character, as.numeric) #%>%
#as.matrix()


head(new_radiomics)[,1:8]
##   PatientID shape_Compactness1 shape_Compactness2 shape_Maximum3DDiameter
## 1       202         0.02781503          0.2748916                48.55924
## 2       371         0.02301549          0.1882100                75.70337
## 3       246         0.02734811          0.2657399                70.43437
## 4       240         0.02681111          0.2554064                46.81880
## 5       284         0.02369124          0.1994242                53.79591
## 6       348         0.03098136          0.3410383                63.74951
##   shape_SphericalDisproportion shape_Sphericity shape_SurfaceArea
## 1                     1.537964        0.6502103          5431.333
## 2                     1.744961        0.5730787         10369.569
## 3                     1.555420        0.6429131         10558.819
## 4                     1.576120        0.6344693          4221.412
## 5                     1.711620        0.5842418          5295.900
## 6                     1.431305        0.6986630          8493.134
##   shape_SurfaceVolumeRatio
## 1                0.2752278
## 2                0.2407268
## 3                0.2007660
## 4                0.3238780
## 5                0.3272407
## 6                0.1976017
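For comparison, the same reshaping can be sketched in pandas, using a hypothetical miniature layout that mimics the radiomics file (row 1 holds the group, row 2 the feature name, row 3 the PatientID marker):

```python
import pandas as pd

# Mimic the raw layout: two metadata rows above the numeric data.
raw = pd.DataFrame({
    "V1": ["", "", "PatientID", "202", "371"],
    "V2": ["shape", "original_shape_Compactness1", "", "0.0278", "0.0230"],
    "V3": ["shape", "original_shape_Sphericity", "", "0.6502", "0.5731"],
})

# Build new column names from row 2, stripping the 'original_' prefix.
new_names = ["PatientID"] + [
    name.replace("original_", "") for name in raw.iloc[1, 1:]
]
clean = raw.iloc[3:].set_axis(new_names, axis=1).astype(float).reset_index(drop=True)
print(list(clean.columns))
# -> ['PatientID', 'shape_Compactness1', 'shape_Sphericity']
```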

5.2 Glimpse correlation between features (default order)

M <- cor(new_radiomics[-1])
#corrplot(M,  method = "circle")
corrplot.mixed(M, tl.col="black", tl.pos = "lt")

5.2.1 Set the first principal component order of the features

corrplot.mixed(M, tl.col="black", tl.pos = "lt", order = "FPC")

5.2.2 Set the hierarchical clustering order of the features

corrplot.mixed(M, tl.col="black", tl.pos = "lt", order = "hclust")

  • We can return to these heatmaps once we have identified the most important features through modeling.
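These heatmaps are built from a plain Pearson correlation matrix; a small numpy sketch (synthetic columns standing in for the radiomics features) shows the underlying computation:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
# Three columns: x, a near-copy of x, and independent noise.
data = np.column_stack([x, 2 * x + rng.normal(scale=0.1, size=100),
                        rng.normal(size=100)])

M = np.corrcoef(data, rowvar=False)  # columns are variables, as in cor()
print(M.shape)  # -> (3, 3)
# M[0, 1] is close to 1 (strongly correlated pair), M[0, 2] close to 0.
```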

5.3 Explore Clinical data for Train

p1 <- clinical %>%
  group_by(Histology = stringi::stri_trans_totitle(Histology)) %>% # case insensitive of adenocarcinoma and Adenocarcinoma
  group_by(Histology) %>%
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = Histology, y = Count, fill= Histology) +
  geom_col()+
  geom_text(aes(label = percent(Count/sum(Count))), vjust = -0.5)+
  geom_text(aes(label = Count), vjust = -2) +
  theme(axis.text.x = element_text(color="black",size=10,hjust=.5,vjust=.5, angle=5))


p2 <- ggplot(data=clinical[!is.na(clinical$age),]) +
  aes(x= age) +
  geom_histogram(fill="blue", bins = 60)  +
  geom_vline(xintercept = c(65,70, 72 ), color = "red")
#coord_flip()

p3 <- clinical %>%
  mutate(Nstage = as.factor(Nstage)) %>%
  group_by(Mstage, Nstage, Tstage) %>%
  summarise(Count = n()) %>%
  ggplot() +
  aes(x = Tstage, y = Count, color = Nstage) +
  facet_grid(Mstage~ .) +
  geom_point(size=4, alpha = 0.8)

p4 <- clinical %>%
  group_by(SourceDataset) %>%
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = "", y = Count, fill = SourceDataset) +
  geom_bar(width = 1, stat = "identity") +
  coord_polar("y", start=0) +
  theme(legend.position = "top")

grid.arrange(p1,p2,p3,p4, layout_matrix = rbind(c(1),c(2, 3, 4)), nrow = 2)

  • The most frequent case is Adenocarcinoma, followed by Squamous Cell Carcinoma.

  • NOS: not otherwise specified

  • It seems NOS and Nsclc Nos correspond to the same category.

  • NaN presumably means the value is not available.

  • The age histogram shows that the most frequent ages are 65, 70, and 72 years.

  • The most frequent Nstage class is 0, followed by 2, 3, and 1.

  • The third plot shows that most cases have Mstage == 0; we could focus on this class alone.

  • The data come from two source datasets.

5.4 Explore output_train and output_test

output_train <- fread("output_train.csv")
output_test <- fread("output_test.csv")
head(output_train)
##    PatientID SurvivalTime Event
## 1:       202         1378     0
## 2:       371          379     1
## 3:       246          573     1
## 4:       240          959     0
## 5:       284         2119     0
## 6:       348          706     1
head(output_test)
##    PatientID SurvivalTime Event
## 1:        13     788.4177   NaN
## 2:       155     427.6501   NaN
## 3:       404     173.5872   NaN
## 4:       407     389.8780   NaN
## 5:         9    1580.7672   NaN
## 6:        49     472.5234   NaN
  • The goal is to fill the Event variable in output_test with 0 or 1.

6 Preprocessing of Train and Test dataset

In this section we clean and unify the variables and merge the clinical, radiomics, and output_train datasets.

6.1 Train wrangling

# Convert character variables to numeric 
new_clinical <- clinical %>%
                mutate(Histology = stringi::stri_trans_totitle(Histology)) %>% 
                mutate_if(is.character, as.factor) %>%
                mutate_if(is.factor, as.numeric) 
                #mutate(Histo = as.numeric(as.factor(Histology))) %>%
                #mutate(Source = as.numeric(as.factor(SourceDataset))) %>%
                #select(everything(), - Histology, -SourceDataset)

train_features <- new_clinical %>%
  mutate_if(is.character, as.factor) %>%
  left_join(y = output_train, by = "PatientID") %>%
  left_join(y = new_radiomics, by = "PatientID") %>%
  select(PatientID, Event, everything()) %>%
  setDT()


fwrite(train_features, "train_features.csv")
train_features[,1:10] %>% head()
##    PatientID Event Histology Mstage Nstage SourceDataset Tstage     age
## 1:       202     0         1      0      0             2      2 66.0000
## 2:       371     1         2      0      2             1      4 64.5722
## 3:       246     1         6      0      3             1      2 66.0452
## 4:       240     0         4      0      2             1      3 59.3566
## 5:       284     0         6      0      3             1      4 71.0554
## 6:       348     1         6      0      2             1      2 65.0212
##    SurvivalTime shape_Compactness1
## 1:         1378         0.02781503
## 2:          379         0.02301549
## 3:          573         0.02734811
## 4:          959         0.02681111
## 5:         2119         0.02369124
## 6:          706         0.03098136
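The join-and-reorder logic above can be sketched in pandas with hypothetical miniature frames:

```python
import pandas as pd

clinical = pd.DataFrame({"PatientID": [202, 371], "age": [66.0, 64.6]})
radiomics = pd.DataFrame({"PatientID": [202, 371], "shape_Sphericity": [0.65, 0.57]})
outcomes = pd.DataFrame({"PatientID": [202, 371], "SurvivalTime": [1378, 379], "Event": [0, 1]})

# Left-join everything onto the clinical table, keyed on PatientID,
# then put PatientID and Event first (as in the R pipeline).
features = (
    clinical
    .merge(outcomes, on="PatientID", how="left")
    .merge(radiomics, on="PatientID", how="left")
)
front = ["PatientID", "Event"]
features = features[front + [c for c in features.columns if c not in front]]
print(list(features.columns))
# -> ['PatientID', 'Event', 'age', 'SurvivalTime', 'shape_Sphericity']
```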

6.1.1 Explore missing value in train

require(DataExplorer)
DataExplorer::plot_missing(train_features)

  • There are 18 missing age values out of 300 patients.
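The same check can be reproduced in pandas; a small sketch with a hypothetical age column:

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({"PatientID": range(6),
                   "age": [66.0, np.nan, 64.6, np.nan, 59.4, 71.1]})
missing = df.isna().sum()             # missing count per column
missing_pct = df.isna().mean() * 100  # percentage, as plot_missing reports
print(int(missing["age"]), round(missing_pct["age"], 1))
# -> 2 33.3
```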

6.2 Test wrangling

radiomics_test <- fread("test/features/radiomics.csv", quote = "")
clinical_test <- fread("test/features/clinical_data.csv")
output_test <- fread("output_test.csv")

6.2.1 Transform radiomics test dataset

groups_test <- radiomics_test[1:2,-1] %>%
  t() %>%
  as.data.frame() %>%
  rename("Groups" = V1, "Features" = V2)

new_colnames_radiomics_test <- groups_test %>%
  mutate(Features = stringr::str_remove(Features,"original_")) %>%
  pull(Features)

old_names_test <- colnames(radiomics_test)
new_names_test <- c("PatientID", new_colnames_radiomics_test)

new_radiomics_test <- radiomics_test[-1:-3,] %>%
  rename_at(vars(old_names_test), ~ new_names_test) %>%
  mutate_if(is.character, as.numeric) #%>%
#as.matrix()


head(new_radiomics_test)[,1:8]
##   PatientID shape_Compactness1 shape_Compactness2 shape_Maximum3DDiameter
## 1        13         0.02888522         0.29645143               106.90182
## 2       155         0.03194837         0.36266005                18.81489
## 3       404         0.01599883         0.09094503               105.08092
## 4       407         0.03135766         0.34937318                46.96807
## 5         9         0.01781454         0.11275905                56.54202
## 6        49         0.03816202         0.51744596                20.12461
##   shape_SphericalDisproportion shape_Sphericity shape_SurfaceArea
## 1                     1.499738        0.6667830        29085.5414
## 2                     1.402276        0.7131265          629.4436
## 3                     2.223687        0.4497036        12509.2654
## 4                     1.419832        0.7043089         4067.6574
## 5                     2.069901        0.4831149         7093.3657
## 6                     1.245599        0.8028264          844.2344
##   shape_SurfaceVolumeRatio
## 1                0.1145278
## 2                0.7038788
## 3                0.3152977
## 4                0.2821040
## 5                0.3760316
## 6                0.5088176

6.2.2 Transform clinical test dataset

new_clinical_test <- clinical_test %>%
                mutate(Histology = stringi::stri_trans_totitle(Histology)) %>% 
                mutate_if(is.character, as.factor) %>%
                mutate_if(is.factor, as.numeric) 

6.2.3 Explore clinical data for Test

p1 <- new_clinical_test %>%
  group_by(Histology = stringi::stri_trans_totitle(Histology)) %>% # case insensitive of adenocarcinoma and Adenocarcinoma
  group_by(Histology) %>%
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = Histology, y = Count, fill= Histology) +
  geom_col()+
  geom_text(aes(label = percent(Count/sum(Count))), vjust = -0.5)+
  geom_text(aes(label = Count), vjust = -2) +
  theme(axis.text.x = element_text(color="black",size=10,hjust=.5,vjust=.5, angle=5))


p2 <- ggplot(data=new_clinical_test[!is.na(new_clinical_test$age),]) +
  aes(x= age) +
  geom_histogram(fill="blue", bins = 60)  +
  geom_vline(xintercept = c(65,70, 72 ), color = "red")
#coord_flip()

p3 <- new_clinical_test %>%
  mutate(Nstage = as.factor(Nstage)) %>%
  group_by(Mstage, Nstage, Tstage) %>%
  summarise(Count = n()) %>%
  ggplot() +
  aes(x = Tstage, y = Count, color = Nstage) +
  facet_grid(Mstage~ .) +
  geom_point(size=4, alpha = 0.8)

p4 <- new_clinical_test %>%
  mutate(SourceDataset = as.factor(SourceDataset)) %>%
  group_by(SourceDataset) %>%
  summarise(Count = n()) %>%
  ggplot()+
  aes(x = "", y = Count, fill = SourceDataset) +
  geom_bar(width = 1, stat = "identity") +
  coord_polar("y", start=0) +
  theme(legend.position = "top")


grid.arrange(p1,p2,p3,p4, layout_matrix = rbind(c(1),c(2, 3, 4)), nrow = 2)

  • There is a fourth Nstage class in the test data that is not present in the train data.
  • The average age seems to be lower in the test data than in the train data.
  • Histology seems to have the same distribution in both datasets.

6.2.4 Merge clinical, radiomics, and output_test dataset

test_features <- new_clinical_test %>%
  mutate_if(is.character, as.factor) %>%
  left_join(y = output_test, by = "PatientID") %>%
  left_join(y = new_radiomics_test, by = "PatientID") %>%
  select(PatientID, Event, everything()) %>%
  setDT() # convert to data.table


fwrite(test_features, "test_features.csv")
test_features[,1:10] %>% head()
##    PatientID Event Histology Mstage Nstage SourceDataset Tstage     age
## 1:        13   NaN         4      0      0             1      4 44.3970
## 2:       155   NaN         1      0      3             1      1 63.3183
## 3:       404   NaN         2      0      2             1      2 64.7255
## 4:       407   NaN         4      0      0             1      2 65.3635
## 5:         9   NaN         1      0      0             2      2 50.0000
## 6:        49   NaN         6      0      0             1      2 86.1410
##    SurvivalTime shape_Compactness1
## 1:     788.4177         0.02888522
## 2:     427.6501         0.03194837
## 3:     173.5872         0.01599883
## 4:     389.8780         0.03135766
## 5:    1580.7672         0.01781454
## 6:     472.5234         0.03816202

6.2.4.1 Explore missing value in test

library(DataExplorer)
DataExplorer::plot_missing(test_features)

  • There are 4 missing age values out of 125 patients.
# refresh memory
rm(list = ls())
invisible(gc())

7 Pull Radiomics, Clinical data and the best Masks

8 R Xgboost modeling with Radiomics, Clinical and Masks

8.1 Scaling Train and Test dataset

trainremoveCols <- c('PatientID','Event')
testremoveCols <- c('PatientID', 'Event')

Event <- train$Event
PatientID <- test$PatientID

train[,(trainremoveCols) := NULL]
test[,(testremoveCols) := NULL]

# Do scaling
dt <- rbind(train, test)
scale.cols <- colnames(dt)
dt[, (scale.cols) := lapply(.SD, scale), .SDcols = scale.cols]
train <- cbind(Event, head(dt,nrow(train)))
test  <- cbind(PatientID, tail(dt, nrow(test)))
rm(dt)
invisible(gc())
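The scaling step above (z-score each column over the stacked train and test rows, then split back by row count) can be sketched in pandas; a minimal illustration with hypothetical columns:

```python
import pandas as pd

train = pd.DataFrame({"age": [60.0, 70.0], "Tstage": [2.0, 4.0]})
test = pd.DataFrame({"age": [65.0, 75.0], "Tstage": [1.0, 3.0]})

# Stack, z-score each column over all rows, then split back by row count.
dt = pd.concat([train, test], ignore_index=True)
dt = (dt - dt.mean()) / dt.std()  # sample std (ddof=1), like R's scale()
train_scaled = dt.head(len(train))
test_scaled = dt.tail(len(test)).reset_index(drop=True)
print(round(float(dt["age"].mean()), 6), round(float(dt["age"].std()), 6))
# -> 0.0 1.0
```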

8.2 Split Train dataset into Train & Valid sets

require(rsample)

set.seed(100)
train_valid_split <- rsample::initial_split(train, prop = 0.8)
train_valid_split
## <241/59/300>
  • We can retrieve our training and testing sets using the training() and testing() functions.
# Retrieve train and test sets
train_8 <- rsample::training(train_valid_split)
valid_2  <- rsample::testing(train_valid_split)
train_8[1:10, 1:10]
##     Event  Histology     Mstage     Nstage SourceDataset     Tstage        age
##  1:     0 -1.0235305 -0.1207812 -0.8392371     1.4175490 -0.1386218 -0.2504355
##  2:     1 -0.5217999 -0.1207812  0.8392371    -0.7037831  1.7024495 -0.3972837
##  3:     1  1.4851227 -0.1207812  1.6784742    -0.7037831 -0.1386218 -0.2457867
##  4:     0  1.4851227 -0.1207812  1.6784742    -0.7037831  1.7024495  0.2695086
##  5:     1  1.4851227 -0.1207812  0.8392371    -0.7037831 -0.1386218 -0.3511044
##  6:     1 -1.0235305 -0.1207812 -0.8392371     1.4175490 -1.0591575  0.1609615
##  7:     1 -1.0235305 -0.1207812 -0.8392371    -0.7037831  1.7024495  0.6186818
##  8:     1 -1.0235305 -0.1207812  0.8392371    -0.7037831  1.7024495 -1.5788159
##  9:     0 -0.5217999 -0.1207812  0.8392371    -0.7037831 -0.1386218 -2.0372357
## 10:     1  1.4851227 -0.1207812  0.8392371     1.4175490 -0.1386218  1.1894541
##     SurvivalTime shape_Compactness1 shape_Compactness2
##  1:    0.7487397          0.3273910          0.2206572
##  2:   -0.7068626         -0.4508739         -0.5451008
##  3:   -0.4241931          0.2516768          0.1398098
##  4:    1.8284207         -0.3412982         -0.4460329
##  5:   -0.2304042          0.8408228          0.8050072
##  6:    0.7356262         -0.8618943         -0.8911634
##  7:   -0.9720474          0.3619137          0.2579747
##  8:   -0.9735045         -1.1924952         -1.1402475
##  9:    0.3815607          0.9177119          0.8979347
## 10:   -0.7345467         -0.0934834         -0.2114097

8.3 Format train and test to DMatrix

require(Matrix)
require(xgboost)

# na.pass keeps rows with missing values (e.g. in the age column) instead of dropping them
options(na.action='na.pass')
train_8_sparse <- sparse.model.matrix(Event ~., data=train_8)
dtrain_8 <- xgb.DMatrix(data=train_8_sparse, label = train_8$Event)

options(na.action='na.pass')
valid_2_sparse <- sparse.model.matrix(Event ~., data=valid_2)
dvalid_2 <- xgb.DMatrix(data=valid_2_sparse, label = valid_2$Event)

8.4 Optimize features with Cross validation

Here we can see after how many rounds the best test AUC is reached.

params <- list(booster = "gbtree",
              tree_method = "auto",
              objective = "binary:logistic",
              eval_metric = "auc",         # area under the ROC curve
              max_depth = 2,        # 6 makes training heavy, there is no correlation between features #1 is not better
              eta = 0.03,                     # learning rate
              subsample = 0.8,              # prevent overfitting
              colsample_bytree = 0.1         # specify the fraction of columns to be subsampled. # 0.5 is not better
             )


tme <- Sys.time()
cv_model <- xgb.cv(params = params,
                   data = dtrain_8,
                   nthread = parallel::detectCores(all.tests = FALSE, logical = TRUE),  #2,
                   nrounds = 25000,
                   verbose = TRUE,
                   nfold = 7,
                   print_every_n = 500,
                   early_stopping_rounds = 1000,
                   maximize = TRUE,
                   prediction = TRUE) # prediction of cv folds
## [1]  train-auc:0.718051+0.043889 test-auc:0.619140+0.061644 
## Multiple eval metrics are present. Will use test_auc for early stopping.
## Will train until test_auc hasn't improved in 1000 rounds.
## 
## [501]    train-auc:0.999879+0.000097 test-auc:0.749731+0.075059 
## [1001]   train-auc:1.000000+0.000000 test-auc:0.753281+0.076136 
## [1501]   train-auc:1.000000+0.000000 test-auc:0.761041+0.071028 
## [2001]   train-auc:1.000000+0.000000 test-auc:0.757107+0.069419 
## [2501]   train-auc:1.000000+0.000000 test-auc:0.756282+0.072293 
## Stopping. Best iteration:
## [1561]   train-auc:1.000000+0.000000 test-auc:0.762555+0.071525
Sys.time() - tme
## Time difference of 13.96329 mins

8.5 Train the model

watchlist <- list(train = dtrain_8, eval = dvalid_2)
tme <- Sys.time()
xgboost_tree <- xgb.train(data = dtrain_8, 
                         params = params,
                         watchlist = watchlist,
                         nrounds = cv_model$best_iteration, # more than 12000 ~0.897
                         print_every_n = 500,
                         verbose = TRUE)
## [1]  train-auc:0.688073  eval-auc:0.543103 
## [501]    train-auc:0.999444  eval-auc:0.842529 
## [1001]   train-auc:1.000000  eval-auc:0.865517 
## [1501]   train-auc:1.000000  eval-auc:0.864368 
## [1561]   train-auc:1.000000  eval-auc:0.864368
Sys.time() - tme
## Time difference of 1.478552 mins

8.6 Predict valid_2 dataset

pred_valid <- predict(xgboost_tree, dvalid_2)

summary(pred_valid)
##     Min.  1st Qu.   Median     Mean  3rd Qu.     Max. 
## 0.002944 0.064929 0.499536 0.434123 0.738628 0.998919
  • We assume that if the predicted probability >= 0.5, the Event is 1, else 0.

8.7 Transform probability to binary classification

pred_bin <- as.numeric(pred_valid >= 0.5)
table(pred_bin)
## pred_bin
##  0  1 
## 30 29

8.8 Confusion matrix for Tree model

data.frame(prediction = as.numeric(pred_bin),
         label = as.numeric(valid_2$Event)) %>%
         count(prediction, label)
## # A tibble: 4 x 3
##   prediction label     n
##        <dbl> <dbl> <int>
## 1          0     0    23
## 2          0     1     7
## 3          1     0     6
## 4          1     1    23

8.9 Extract the most important features from tree xgboost model

8.9.1 List the most important features

features <- colnames(train_8)
importance_matrix_tree <- xgb.importance(features, model = xgboost_tree)
importance_matrix_tree
##                           Feature         Gain        Cover    Frequency
##   1:                 SurvivalTime 1.518797e-01 6.403812e-02 0.0451349512
##   2:             glcm_Correlation 2.110504e-02 2.498131e-02 0.0242685416
##   3:                          age 2.073723e-02 2.646743e-02 0.0215468360
##   4:      glcm_MaximumProbability 2.024262e-02 1.942644e-02 0.0176910864
##   5: glrlm_GrayLevelNonUniformity 1.741357e-02 1.557860e-02 0.0151961896
##  ---                                                                    
## 884:                        V3718 1.371769e-05 3.366216e-05 0.0002268088
## 885:                        V3809 9.926757e-06 3.106980e-05 0.0002268088
## 886:                        V6129 8.795324e-06 3.016645e-05 0.0002268088
## 887:                        V4196 2.153746e-06 3.204075e-05 0.0002268088
## 888:                        V3894 8.746036e-07 4.291979e-05 0.0002268088
  • SurvivalTime is the most important feature, followed by glcm_Correlation, age, and several texture descriptors (GLCM and GLRLM features).

8.9.2 Plot the most important features (Tree model)

library(Ckmeans.1d.dp)
xgb.ggplot.importance(importance_matrix_tree[1:30,]) +
  ggplot2::theme_minimal()

8.10 Test Xgboost Prediction

8.10.1 Load test data and format to DMatrix

test_sparse <- sparse.model.matrix(PatientID ~ ., data = test)
dtest <- xgb.DMatrix(data = test_sparse, label = test$PatientID)
pred_tree <- predict(xgboost_tree, dtest)

pred_bin_xgboost <- as.numeric(pred_tree >= 0.5)

print("ratio of predicted Events in test dataset: "); table(pred_bin_xgboost)
## [1] "ratio of predicted Events in test dataset: "
## pred_bin_xgboost
##  0  1 
## 62 63
print("ratio of Events in train dataset: "); table(train$Event)
## [1] "ratio of Events in train dataset: "
## 
##   0   1 
## 138 162
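As a sanity check, the predicted class balance on the test set should be close to the observed balance in training; comparing the two proportions (counts copied from the tables above):

```python
# Counts from the tables above
test_events, test_total = 63, 62 + 63       # predicted events in the test set
train_events, train_total = 162, 138 + 162  # observed events in the training set

print(f"predicted event rate (test): {test_events / test_total:.3f}")   # 0.504
print(f"observed event rate (train): {train_events / train_total:.3f}") # 0.540
```

The two rates are close, so the model is not systematically over- or under-predicting events on the test set.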

8.11 Submission

pred <- data.frame(
  PatientID = PatientID,
  Event = pred_tree
)

output_test <- fread("output_test.csv")

submission <- output_test %>%
  select(PatientID, SurvivalTime) %>%
  left_join(pred, by = "PatientID")

fwrite(submission, "submission_fea_img_r.csv")

submission %>% head(10)
##    PatientID SurvivalTime      Event
## 1         13     788.4177 0.08963599
## 2        155     427.6501 0.89507192
## 3        404     173.5872 0.92702597
## 4        407     389.8780 0.92249340
## 5          9    1580.7672 0.01042625
## 6         49     472.5234 0.94254625
## 7         55    1970.9725 0.09879023
## 8        200     530.4248 0.27764136
## 9        170    1067.4630 0.06317536
## 10       387     378.3248 0.64743817
rm(list = ls())
invisible(gc())

9 Python xgboost Prediction (Python, Radiomics, Clinical, Masks datasets)

import matplotlib
import matplotlib.pyplot as plt
import pandas as pd
import numpy as np
import seaborn as sns
from sklearn.model_selection import train_test_split

test = pd.read_csv("test_fea_img.csv")
train = pd.read_csv("train_fea_img.csv")
train.head()
##    PatientID  Event  Histology  Mstage  ...  V8461  V8462  V8463  V8464
## 0        202      0          1       0  ...      0      0      0      0
## 1        371      1          2       0  ...      0      0      0      0
## 2        246      1          6       0  ...      0      0      0      0
## 3        240      0          4       0  ...      0      0      0      0
## 4        284      0          6       0  ...      0      0      0      0
## 
## [5 rows x 8526 columns]
# impute missing age values with the mean
train['age'] = train['age'].fillna(train['age'].mean())
test['age'] = test['age'].fillna(test['age'].mean())

test.head()
##    PatientID  Event  Histology  Mstage  ...  V8461  V8462  V8463  V8464
## 0         13    NaN          4       0  ...      0      0      0      0
## 1        155    NaN          1       0  ...      0      0      0      0
## 2        404    NaN          2       0  ...      0      0      0      0
## 3        407    NaN          4       0  ...      0      0      0      0
## 4          9    NaN          1       0  ...      0      0      0      0
## 
## [5 rows x 8526 columns]
# Set seed
SEED = 1423
train = train.drop('PatientID', axis=1)
test = test.drop(['PatientID', 'Event'], axis = 1)

def prepare_data_for_model(raw_dataframe, target_columns, drop_first = True, make_na_col = False):
    # dummy all categorical fields 
    dataframe_dummy = pd.get_dummies(raw_dataframe, columns=target_columns, 
                                     drop_first=drop_first, 
                                     dummy_na=make_na_col)
    return (dataframe_dummy)
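To illustrate what `prepare_data_for_model` does, `pd.get_dummies` with `drop_first=True` expands a k-level categorical column into k-1 indicator columns (toy data below, not from the challenge):

```python
import pandas as pd

# Toy frame with a 3-level categorical column (hypothetical values)
toy = pd.DataFrame({'Histology': [1, 2, 3, 2], 'age': [60, 65, 70, 58]})

# drop_first=True drops the Histology_1 level to avoid collinearity
dummies = pd.get_dummies(toy, columns=['Histology'], drop_first=True)
print(list(dummies.columns))  # ['age', 'Histology_2', 'Histology_3']
```

Dropping the first level keeps the design matrix full rank: the omitted level is implied when all remaining indicators are zero.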

# create dummy features 
train_xgboost = prepare_data_for_model(train, target_columns=['Histology', 'Tstage']) #, 'Nstage' : 3 classes
train_xgboost = train_xgboost.dropna() 

# create dummy features for test
test_xgboost = prepare_data_for_model(test, target_columns=['Histology', 'Tstage']) #, 'Nstage' : 4 classes
test_xgboost = test_xgboost.dropna() 

train_xgboost.head()
##    Event  Mstage  Nstage  SourceDataset  ...  Tstage_2  Tstage_3  Tstage_4  Tstage_5
## 0      0       0       0              2  ...         1         0         0         0
## 1      1       0       2              1  ...         0         0         1         0
## 2      1       0       3              1  ...         1         0         0         0
## 3      0       0       2              1  ...         0         1         0         0
## 4      0       0       3              1  ...         0         0         1         0
## 
## [5 rows x 8532 columns]
test_xgboost.head()
##    Mstage  Nstage  SourceDataset  ...  Tstage_3  Tstage_4  Tstage_5
## 0       0       0              1  ...         0         1         0
## 1       0       3              1  ...         0         0         0
## 2       0       2              1  ...         0         0         0
## 3       0       0              1  ...         0         0         0
## 4       0       0              2  ...         0         0         0
## 
## [5 rows x 8531 columns]
# split data into train and test portions and model
features = [feat for feat in list(train_xgboost) if feat != 'Event']
X_train, X_valid, y_train, y_valid = train_test_split(train_xgboost[features],
                                                      train_xgboost[['Event']],
                                                      test_size=0.3,
                                                      random_state=SEED)


import xgboost as xgb
xgb_params = {
    'max_depth':3, 
    'eta':0.01, 
    'silent':0, 
    'eval_metric':'auc',
    'subsample': 0.8,
    'colsample_bytree': 0.8,
    'objective':'binary:logistic',
    'seed' : 1423
}

dtrain = xgb.DMatrix(X_train, y_train, feature_names=X_train.columns.values)
dvalid = xgb.DMatrix(X_valid, y_valid, feature_names=X_valid.columns.values)


evals = [(dtrain,'train'),(dvalid,'eval')]
xgb_model_fea = xgb.train ( params = xgb_params,
              dtrain = dtrain,
              num_boost_round = 5000,
              verbose_eval=200, 
              early_stopping_rounds = 500,
              evals=evals,
              maximize = True)
## [0]  train-auc:0.865973  eval-auc:0.842391
## Multiple eval metrics have been passed: 'eval-auc' will be used for early stopping.
## 
## Will train until eval-auc hasn't improved in 500 rounds.
## [200]    train-auc:0.986367  eval-auc:0.837945
## [400]    train-auc:0.999171  eval-auc:0.839427
## Stopping. Best iteration:
## [3]  train-auc:0.928519  eval-auc:0.845109
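Early stopping keeps the round with the best eval-auc (here round 3) and halts once the metric has not improved for `early_stopping_rounds` rounds; the bookkeeping can be sketched in plain Python (toy scores below, not the run above):

```python
def early_stop(eval_scores, patience):
    """Return (best_round, stop_round) for a maximized eval metric."""
    best_round, best_score = 0, float('-inf')
    for i, score in enumerate(eval_scores):
        if score > best_score:
            best_round, best_score = i, score
        elif i - best_round >= patience:
            return best_round, i  # no improvement for `patience` rounds: stop
    return best_round, len(eval_scores) - 1  # ran out of rounds without stopping

# Toy eval-auc trace: peaks at round 2, then declines
print(early_stop([0.70, 0.80, 0.85, 0.84, 0.83, 0.82, 0.81], patience=3))  # (2, 5)
```

This explains why training can stop at round 500+ while the reported best iteration is round 3: the eval metric peaked very early and never recovered.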
# get dataframe version of important feature for model 
xgb_fea_imp=pd.DataFrame(list(xgb_model_fea.get_fscore().items()),
columns=['feature','importance']).sort_values('importance', ascending=False)
xgb_fea_imp.head(10)
##                                feature  importance
## 0                         SurvivalTime         490
## 3              glcm_MaximumProbability         101
## 2                                  age          99
## 6                    glcm_ClusterShade          81
## 19   glrlm_LongRunLowGrayLevelEmphasis          67
## 32                    glcm_Correlation          61
## 8   glrlm_ShortRunLowGrayLevelEmphasis          59
## 70                   firstorder_Median          57
## 33              glcm_ClusterProminence          56
## 13       glrlm_LowGrayLevelRunEmphasis          52
# Confusion matrix
dval_predictions = xgb_model_fea.predict(dvalid)
dval_predictions
## array([0.57204545, 0.43744403, 0.9311881 , 0.7522133 , 0.8992599 ,
##        0.3539847 , 0.09718167, 0.9264896 , 0.14718276, 0.87452906,
##        0.94777375, 0.63869435, 0.7442223 , 0.38633287, 0.87755233,
##        0.21450518, 0.06625096, 0.57183176, 0.3708731 , 0.8277049 ,
##        0.56574184, 0.7192741 , 0.87758064, 0.79865056, 0.64621705,
##        0.4627555 , 0.7391719 , 0.7600696 , 0.78452367, 0.12071649,
##        0.9022546 , 0.20555827, 0.168312  , 0.67517155, 0.9508647 ,
##        0.8093434 , 0.8452269 , 0.6145063 , 0.63965183, 0.9538013 ,
##        0.562915  , 0.7859778 , 0.24059518, 0.10105682, 0.16814843,
##        0.54898244, 0.78313303, 0.32389146, 0.4743114 , 0.8112633 ,
##        0.43723333, 0.8193109 , 0.95397776, 0.09396309, 0.96094865,
##        0.48283294, 0.460934  , 0.8384317 , 0.14103228, 0.9584772 ,
##        0.71298057, 0.84272087, 0.7511111 , 0.9227    , 0.76937944,
##        0.931369  , 0.51061213, 0.9658959 , 0.288277  , 0.64524114,
##        0.87713796, 0.76477253, 0.26440895, 0.9560654 , 0.8963974 ,
##        0.8629451 , 0.26993212, 0.9068126 , 0.7414487 , 0.5396428 ,
##        0.698584  , 0.08270516, 0.9700313 , 0.50430924, 0.93455327,
##        0.78382224, 0.83545995, 0.82780474, 0.92010456, 0.14274152],
##       dtype=float32)
from sklearn.metrics import confusion_matrix
cm = confusion_matrix(y_valid, [1 if p > 0.5 else 0 for p in dval_predictions])

plt.figure(figsize = (6,4))
plt.ticklabel_format(style='plain', axis='y', useOffset=False)
sns.set(font_scale=1.4)
sns.heatmap(cm, annot=True, annot_kws={"size": 16}) 
plt.show()

# evaluate predictions
from sklearn.metrics import accuracy_score
predictions = [round(value) for value in dval_predictions]
accuracy = accuracy_score(y_valid, predictions)
print("Accuracy: %.2f%%" % (accuracy * 100.0))
## Accuracy: 72.22%
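One subtlety in the accuracy computation above: Python 3's `round()` uses banker's rounding, so a prediction of exactly 0.5 maps to class 0, whereas the `>= 0.5` rule used in the R section would map it to 1. An explicit threshold is unambiguous:

```python
probs = [0.5, 0.49, 0.51]

# Python 3 rounds exact halves to the nearest even integer
print([round(p) for p in probs])              # [0, 0, 1]

# An explicit threshold makes the boundary case deterministic
print([1 if p >= 0.5 else 0 for p in probs])  # [1, 0, 1]
```

In practice exact 0.5 predictions are rare, so both rules give the same accuracy here, but the threshold form is the safer idiom.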

9.1 Prediction with Python, features, Masks

dtest = xgb.DMatrix(test_xgboost, feature_names = X_train.columns.values)

dtest_predictions = xgb_model_fea.predict(dtest)

output_test = pd.read_csv('output_test.csv',  index_col = False)

output_test['Event'] = dtest_predictions

output_test.to_csv("submission_fea_img_p.csv", index=False)

output_test.head()
##    PatientID  SurvivalTime     Event
## 0         13    788.417673  0.271252
## 1        155    427.650092  0.838568
## 2        404    173.587222  0.945236
## 3        407    389.877973  0.898751
## 4          9   1580.767244  0.086320